[Binary tar archive — contents not recoverable as text.]

Archive members (from tar headers):
- var/home/core/zuul-output/ (directory)
- var/home/core/zuul-output/logs/ (directory)
- var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log; binary payload omitted)
w>[ㅵ [<痚ojO~i mwEu:O=L7sԤ'D0{h6o ]0&'Q陸GAz?_SJȎY?@'#z3VV(U1il8j0-PuB%a\[*S(D$LKC#ڂo"JN'6Y),ӡM(rneL Ȣ`;X+\$JքD45&,tbIW]pUZrrkG$!Hi=/\(i"b)"bX t1qKD{DdD0"VRK./\Z]^WK1WJ!~9\}s\c^J*~ +{^z\L\I %^(%=/sj޶׼zL8BnRM I!7)&ܤrBnRMJS!7)&ܤ[XM I!7)vrcE B!7)V4rBnRMR^B,$ ƲAh}T=ubB ̝Uabbf'yV@E Nx5*8iЛYT0A?LOWz0uhcܩ?miv:LW uA% 84Y[( f rA}7~ ioan:o7> SSߦ'~&1鴮էhQ׾, +à z)E߯v􍿦ٹ7>a+ a0l°Q6 F ev,݀8+nz'r'8ABVzDhraTq7ns[| }*5:^M!>txˆg#BXYL^jʈhA #(H8 J ꓝ<٫Mmfhi}/̋/RߙZʄ2a^!X&HSE57V"FMsFV;|yْ\t>tp)60' a8ZݎoFBb<ZY!RTࡢK¢DRC~;<hA q/ 8*JIK`dF2HX% C {jLi]:k!Vꉻ $]uo#띪1=uOyp>vh6B Zv6z^4y٣祖t2_i-vwVԳ|Gm2M:FSlwMwk7w m6ulIMVbq/r8] gcslLR.(6%5N*,s L*of1Wn e*+g O^v74fCU[G߅{`;F'(8m4=C.K,6útМR*/!P0vY4a#A29HysQ b rgg tWT]hx&L+oG+ X|kol-.%;M0GpKxzZ)$9 VS `9*i#j1Q2mpFWX /X՞ U" r(o6,Rg2D'HkKI S"!&&Av$˝M${ǏU^1y7nÓq^;IX=k|}ΏG)4Z 1 J S2N!AJW!2!AM4p:.7qnV/v%ߖD Hgiԕ јfj9-*Aƒ EiCp離%8i}&)_L Or Z3Wx1 /a=|cm&Tih;'+^,%[\qνFЕ+A.G;('W[gfX=v~:|lUl6N3RTrb`*2Lb1:/%cL3Fu|҉Cjø:h"(F@ɱRL f6X0u;p}\r ux0>moU>k+0mz6:R-DGqK y͘e=B`iH*XY5nϣUh4BkM̨3hAܨQ1D5T5885x8՛=GP.VJRgcyD',5]P)$S -奔Bi^ -IIP)Ki)$>1CW?pU5<  Ƙclz?Q "iU$T"_iY"CD /aw>7Ue @Fpkw?rܵyvܸ(࿌uJ4bA2Tغ4 :KzyfWI"Zj;:c AO R~F!TD:spY7o8$MmQ,%nrꪵzfׅJK.m,ssIMQTqkTeʺUeݜv*]Lt]m~' ﯦ0aO=yeķ$DC}{@u蔸إ[wR5y\2*jIUGM/9JK✺xV8L"8&}͍Zǜ2 ecSFkqZzO ni8/V~yMk%;^$(Dg@D eH_< YG.]&bvF[ueRх,||NFC&1np7"$u˾no6}gXv%MFܛk%uG`J0l[H /fSclV|Bup2z>}1N][ԻN5B|H>A.PDlБDW$1 _uSnKPNs]٭,=m3<MGJt u@-9]-:R_}qۏ7LsE~6sy9u9\NN__1sB<˽y q0;\%ʧ;|r.mؿkuoLu[R{YTE~?'?H3uݕqMWմa^\Fyٽ1,+Le{'K\[~8;-'sraj_Gi$);NxV6Wm7 >X ˾=~`?yTŊ1َ24jDMͅkjlaO3d]9]j̮J6uoZbmMiͱXjmNws7K#ٴ6ؖA5ﷆwm5.Q2{I5{Y NFD:FqX?-Kbua5Z m9{,/_λKc0KDjnpGȖ})sT1#gy^x}Ch1qs"1K_ō*YSRK1!#[2" F●~QC-"}[8#8^ydk㊳ŸZrȷCJU:N4fHqHP 8bpI}Hs@Q7(#M{~HMuG:$؈b+JZ8Z)K_cr'1 Cw!9FZGdsH g !ȍlkqJ!6lN[-t]֒]̼e&TAUCKڙ)B_2$ UCs.!Z6V|MxȴX!93L&| vcb fn)ȄV5.MG;ZCw4-,jvFd)U((VٯB&T-X[ a=vh ԃ |k@GbGi$(Q*T{c<*J|XBx yVA8 u-XLĝdݴLŅ%̆mEra6-b@ᾠD}vv j Cz4P׃8>;<_M#h99XO2Qa:TͣAsH7| C爴65v/ ]TFYQsǰqZkIٺ!XQ"4Fx31`Ogoȣlk8n0=ṕ\9@؍[)&:M'%Θ)Gup l+ )\ѥEϡJyHsC ׷mכfg5¨ x?+C Z%6,'w ê#h.(j :P87mcdvbiFЈ׻޼wϦkP\36Nrw>3ZWj $cm r 
%/N8m!1Áx*C!?\Z]-kPiRvsy3 Gz*~ bp(` E̛NLp ve| V7 MvXQLJKFd>)q4gK+U]~AWW<2Z 搂1 Vq|^0{ٺ,Pq[Psu^ aN_~Gr@r!v^x3ΧuSDnody*4c'2v|H$PƬqBnEwژ$Pv(~J=B'J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) hI H lH!\B!Z6C'fW$#"| cRHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %R@dB Ds+ƃ!Ym@T$c$@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J=bhE! p0$Pl'J=F(1%$@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@zwk}ݿz_,9gR:nO_=|ӃW;d)h7BG<)f׳-r!aA{ []jhsT8g'/i+)g \!Q¼rb.7ʲxeaɸ l#TlJ lc@@z?g_߸LG'~,⳧'y!p[Q9>YJWr`v,IvQ^Nww鋃C>d|݉`L0Zw}EoeB93W|#̽6IGhr2<Ih^. y&ċq2ଓXoxD?ɓ^|Gz&ѕ ˋWGO~<8}L?y?caZy7[I+Gs${̐ۻN~V]L&֍' h]@juQ=,M{W_zзWZвhvn['=/y07{nj}Ǟ MžM;:aQ[{V߇'rMekuq!Ofl;_6w m6wOb2qK}#1)7}sCtEV,?&g?6%O,hrD\a<n||9j<8#{sǭNؽ0 k]k81WcϑvWk:P)*XRmHgåXsc_HXh7em+=+z-bcq+jeSNKC`&IJ"A81QrkN~:" !G&lib2 +h/Gӱkϳ!ӣ]W:嵬hpλt$z&88R%v)\E8qX$<*bpSN ςc`VؕRWǬK̺rKKIasI}ʝaG1QA1Ir\s嘈'钺:a6t*wÙRxÙ/ն{$Ak% W&;=;|mj!x:w? 33 IzXrBI*{K6<5̞T+~zBzyu';R6Pq 0E]ݤLݕ)pgWUy6 є* քK ɥT9>U6ͧpR;>f0 G,o߀}*+V1qb]mr\JQgluar>j.2ŒGȇƕ?ݴ{"~poTtՓЉ͊}(-iJ?Yє1o."ȃM+vp02ZvT@D )uǶsQd!:h/s!I ^riSyGXd0((%=2k- Q ]Be*dzzn=0mOFОfbyr\C?5WQgajmjNoFIW:׵+8 XT,(U0xa2M5RZB\^Q"_zf4)8mc ' p/8r:hȌ^Ii3lI-^y|ÿ|h&+ 8dq?w<[ГmñCG'BG ,V^O? (D(B! -,% I;pԁӂ#j1OI[_OTs)*bXDDD5βe33lG ǣsa~Mז"q vO1:`\8BK?i|wlLS\fjk),<(0lE8!؇@AVp8gha!0 :¼ z{# M3Ji޼헛l~s"&>E$vMbu)]{bD]b0+`y`U]_k''lgy#a8:uf08UQ7zܺ:#)tS:񢄝yG-W^%2\J?=ѥCN ١O%YJLa#%nQL;ш"j[jgtɋ痼8hZTf(8e|*JA8O *u^e'3BlT7`ҟ Þ1-Yo)ږt oKygX1^=^5+}.eU8*HAsiS+]ZSЪWdLK5nP F77I V5ƥ+6{aR g76J0M]:@Pկzo"Wr~ηcy8z훪yd>K/B? 
eo%pVYѕ<[RRiF@0#b2'rZ=~R.q_3O7aR|HDK`RGTPJm6>ɹDN{jƍ4j)RS)ܾ:qMGt_ˇmF1d0l9ʪʰ5-EǸ.ngquIrH8+PכB]O7OLj9+j$Zy`V{&3) eec}J2bfٖ'|NB+7~6B2A3抇 ( $H\0q1išs\;sugJ dE af4jPӈ!JhDDqQXv.~֫Egv6 7*f p jP EP6()K)S>G]i6-?)bnzKHwux|](&3 `hs\KxMJE c.+@;SXw]Ɉ3vK7QfYzlpX%tcX(ϼi8t U|uN&!D!e!x0k5f,`k1hFs+%"[љ/|nau`{1`F A VDt^[G :* \a9N`.0 |# T)DeֽEM (#wU3N)_kJ˗->yin7u&@{b6-iaON*xv3ksSwm b*%!)W!HGE(81R64L*'}eLB}%4ۂ N9aLm7XP1g&-[2nG)'vKe, U' ^aX2M6t?ϒw[>y F~9y;0F&x$Ѩ.uc^eM IRLBDN:ٚN%L"88f pdGf9AQNJ&{@ Ҙ|XcoǴ Z(QHLd$&F-)DXҠ NFbRKk8㴎ia`*lکU]uyʬ̘":w0h]?,Rh4E*r$B,8ce!RJ0^J|_` ֐G=sAZn<'Q*LA@FʵeJ N'6YoT($ѡM(rneLŨEw9lVJH$Uј;9$HbEBp.؏[;p?P'l:) &jb w MX5ޕ$R˶1]RއcnٗyZѤö}"(%(r DW%2"23"9NxQ >^Iv.,_h;fO.DuyG[p`T@JjmJaQ6 J*˽uw/Wɿk\ ..B\Y\L_5/h/b~-:i!:Cn;jqz[I>K> /M5*NqsQ`ftY'7,}B`;YΊ3`p7s7s2 ݊In i+(ecˇj% f'Ϻrug≅60^7L+Xz[ՆItQ2mK@hDdmqIFwh¸rB[&uջ+T.u}9kf\q$cv{ E9ͦwhms޴aZĖ̺I7']W Pj3{o8Z; 턷1JUl<͚5~`޼.T1։6IJZF݊I @.`K,@;ptt^3fs5X-Jx4:c *kc&PfTet@nT"IwiY[#g/Yϡ`0+n|u^ST)OWg'Rsee3)\a/mR*pyU9L. &cu/_zɱ$QK-9+RpKԿ {x)}JR'9A‹^NP0vYɀ OjRz\Tkni*>?:T. 
O}c(#&= EoOu%Կ9~Z߆AzhWM}57W@}e5~KgfY5Q+3:g) վ $L!UCN GpO3$bkM!}$~ ; G5cPaI0"-`A ?O~[k[ժ ؚwl$ J%RЈKfDX3((dQG?_~7E"FYH\ӌ%,cy a/jUQeM(&Ϭf tPI{_Lno4YN@(Sn.Q?<;4xیGr?Oj9- 2 :ټqbfG2 _~U|zOAR`Lu񃯽 0Û^?WO`Qڭvgwɲ5nŁ nEq^!|:Zm5#zx8/ YE'3A%(AO!iy1a^_7i\&T2B]'JAU7 H[]FS%ZbeP'5lu;-c1 %.fT6dʗueBP[`'st%3اcW8x_pb+:Ǽd&`dZ}mvY*fS|Z'|V`7u[^Mn_rx5@*<)ÍVIql۞iKLYƘhLP[lA3cyzz=B٢E UDsZ`XG+ò IUp;oB!gh՞2#,RFXp4nYe VTV-B}ҹ ߧ9H6U[˪lfaeJɴΘ+ ?`+IdR!KN#b`Cwz;ߝ?Li1̬R nqQ;ĐR ȽԃB3 2 t~w:UꨋXI;m@6" oT V$26ԡ(lPRF#;u>qS՛`YpD۪vgCn{]r>WU2UzV$dXKF ,aM }W'?]3p%Mf :np}x}5Ƹ֜%=&[!D!e!x0k5f,`@k1hFs+%kG/>͏J.Z7H8fT[`eHJGy9`Z0*VX,@(vGE38 7PNbXXHJ\˾vklp2]5ÔҶ'Mۮ$m;xr79}rURâ[_ xrϻ.7d=򣆵WD4?(dZy4c Ie32"}RH9#Ҝ2--b!GkQڄ&|YP=)oW12#A(cL%նfl͚q=Jً.l3 u˺Pup5Ɠ\pnxsSouSA/AiztklN UAS\ "(x8J%s0 #HS дUM !D%$It)-9T_R@% T<=5yybFM" E*+ͣdP_M&8rgc~|߾q [jeQL1EL 6T*7K4쎞WNɏ r\H/ &>LBU0,:T)OnxS]~Ey}m*_?o~xit:~r!jr5kκNjng6z/Omo:g'Fb~O狺ada71^Q8y%.'={- {edyA.uXQԴa>w d>7i+uSv-=UdIymX|݋Sd/_:|/_~2O/߿ZpF: IPFhwmXI\*]˒OX~oW&vh~j[k]O__« tau>#w3:a'H6ys6v&jj~*C9LV!1z@`Ѿ|~u-#]G~e8USɫS7$PbF/@} v{/:>cQGODk.2쇅;<߿ )gJ۩QSt3T2ZF&( z깉* wg"bzwdZԿ k/iCiZbPB bJ'U$%%F@QШ<kB š$.k%׶E6O=mo;Ex@W Bo :mx^C 4jQ=.蜟{n:'FGWhԜ?LBjhsIO: FuX匿dn4Ø }])i 6VNVbmٕ? ZAj/)wIY#+cCrtjA7qQqT} YK3i=t4=nJb&[vWi—Wɠ޹**$TF :jAWiU !A hpqfl>\c^Z;Ůtt5ӸfBQ}n/:ˉnzw<='^F@_Pt3N {G<^A9-W'8WN?X|mzPۈ{ *瀜ɣzjO~ʫtuӫ4Q_'=8\d_h9&G͠d@=ȒwF_/;H>8l43c69c;曎c cl7V3&gr׸+:S\Gkp>k_ԻSY^ ʟt ѠI @!I6CҼJ4=~~ ص߬pm(MD!#ͳZfؖ^=U3:]XU:y@.8:Ю>-5Q*QQF!$H~ Qm ,@pm9wI ܝxd[H WP{O]<3;*K"HAFpIVʠEO#urt'6DB-ME ?ygLD>}tQL8Js~X^9l%)a ? -8j~wTXs{>rsёD Љr$0IٚrTA҇*&h.!hsP3@p&e6 s,PKה+Fu@J>/\1Pծ6{5l*0g̽1͐M-Lxa8gQH9hIFhg8OARze+EwC3M4Fu0Grf G2hU7jꚆ~ )o%;U6#; Avs|o][|#@N+1ð)\eWE-WgoKwt$߳οF]_xnqI`yVO\MRmQj3e{̶01zA خb1J&ijb (w[ %QMJ\9&ٿXmSX]T~SIV- S9<u=}\g'O&'3O&:3ꖅhp5<ӣEHa-Ͻ<9769B8ajo,I݊j\T ?+t1$ jTJ\>|E^}+QT3JRȩI( =WT=4:اq,tJ` %R*&TR5c1r֌Q^ta1XWºPppJQSޣ\1^`yˡP? =;9>ӎ5h8y$R8iA2hRMhJbvØ`:15 t'i pgYmr'!(fJDN ]|L+]9wkl`l)8mZ"eilaQ}T;-(Ȑ"G&D%L{u>.$G:*k bܭk~Ru$+CшcM8F\7zF,1 m85I*ʃ py !UipTȬӮt(v)rV,ӢDYR_;5`aohp֒_$_-;c$B𭡣ktGWń.J(Z^9l%)a ? 
-8j~wTXs{>r!DG& B' $Xb^zô'esPKH!W 5A)΄لM!,Y֗RA`}j"հai Hn|kL3d} D9GED-#EV !F&qR< >JAZCD1z{ x4Ӱ6+wEok(3߶͞*m{13KuodŴX*gHŸNz^^lVَv,YyF¨^#!HȦ<(_#$"'EqE&h3LBֽӲpHK^ 3/Fw/C|]KnO` }}PSyLE@rc)e"d*-`"H"qڇp4l*$x[H b) E,YYX:7kݳ6B[j#D3H\& $c&}@wƕ#<&Nj,Êz&Y{jG#TG#u` `!g%$ZZޓqd+_&yhŀ<p0qyB"!T$9&6EJMZ@]:U}:М [EҔ/.ue-g9t_>Hb?tI.lʎ}I33ݑx:a'^aYwZּ>yͅx=T/4139+Dj3xR8.aQ")[VMmcUS[22)F h=bFeUR@C(=Jf$ͪ{p`pv@]D5]hC/[3i¥}dR o?usk@nmI.6I,*gk 8_gKlEj7ꇞg;;^J׮!pT"|B!< N0b|p_xf?ٴ9jW%h&V+P:N0i8ri1I3ŝKJaR8d|=@?J~}/>9&???&`ւ o@g`?Қ44W-Mfi|] &\Cb. f Ju~{KCvOtKijHA&_}2WNQ@*TS6](!?۸]ѥ6Ҳ6[p:[Ra)&M̰yw7P\=<󺋱"gexPQC{Njc~dƽ/?$4.1=VF #gZvTD /?mT9+H!%z84ƹQK.T9B ˅VG=FHd4[$%A:`i/m 'W-Ew6ߓu<*#\ZtZVRQ*ko.r8⾛ ꨥƹ+1DzzD ;kH`w>Ӓ)E@ՄfL=:Z@,QsAQ/no==vYR(0TY<&8 ݆`Bޙ$ oPhNgpJW?}7/m^$9<8&hcoj4UղZ-Q-k3ڵJ׊ F't- V:eM v<J*¦e▱م h]<Y,f!Ynu '1beP]i]ft `%qyå ЫXL8P=)/FY^1 "Z+IouftYhewW#vSE0lw-v[HV;G4"Z"$> PKcv--^aKI CKBQY2dC&2&b e gL`)qiDLlqhC*"mc-B˔ȊZiR'd!JhDDq`"l480bEcD2Y+P/5eDDL ` Xy$RDm -Gָ wG@n[I~- 9}ܿ\s FU6 }GJm 6Knni]mrqlݎ ةLHx 6]FiM@*˽޵v+S٩Ϲh]Pҟ "1K / 8pѤ;3N0e|{3n~k[b-1wyq>ݬT6fވ^VJ0 nnڥ'.@V sVl'ǠvuaDZ0WBjrC2[l pbYZCWg&(^(6x4Q`AmWb'ǀI66VN!iK4]z8<(\܊ tOkM:b?ݾnD{)m {{W܏+^ h=wZ!wIbef9p ī5U-&1~;-gLQ))ٓx-[WS;ͪJ :фvj_Oվ^Q^E՗}noG 9N<*\Ã#2!-B2210 b<82,Bj',6.5LÍ'U_u{~56=:yujP{kTq=Q?s7jg8jR2pL/ Xi̬ o$#MשPNXK7 렉@ܢ"J1-Pcq$$6IV!Lc]u'4ߕ󙹙m^kwc_.C\-DGqKy͘e=B`iH*ѧ^{16J)v=O]%p8vG_޵0EˮvE|PըwXUj'h5"aWAq+"-z` ԟ^=};?~|ә}\lqZ+ͷM5,L96%Tlcg J[6 ,0S.OI+'î Jjz X+ʘ@U˓aW \O]%h5>vvPkˮ^7O$;!vS&N]%p?vU@P]Fv qs|Ƿ%r>|~V >ũtsxLLBC4 TE)amH:T07=H#zE.!~0;9VR)7zTe,Ktƨ ,7woC^% JxDpi\`R"T*5Vr.ALzq ?!EW7g_SbEu)y3m~N[Ct_ĥg3psOp"/`Ԕ%TiyWUtjd lggY T AkMvU6Zmnڽ٪T T4>? GcZɑ_mCux|{ Gxfç$٤T'݉")$P$%Ff( 2e(]g%_#&sztj؆N5;/˫ Z~sO?O? _?}h u 1|yExpoD[vԋ ~_Isdϐ{b%$ 3x5VZ=1y й2rC0ڣ@puFWk#TV,hJ=\g=86@]a >4|eWͱwa󽡣o?)n/G)+_;tIC)|9=#}&6paD\6f2`)C|_7@^7}} cC^)v鏏 Aâ`v"g݁w]NVF"Vj?R!.:U^l19 YdsH! 
QHLBE4n˞ UX,14mr:sRE010=x`)j+Cj߀6֌\ޜ'KJƇ8 ^EVx8xwALë«Z8P4`xʓ) UFٱǕQVy=ƕk=uU˼^qQZ }.օ9v˖p\0)BU}V5|piy@`zd Je<2eokU|JZeaE]JK#:,Fg29r;YܴA+[Նc\712zڂec?-W?d|k"+wvK0̮׾Vb/-]/vݻ\]}Ҭ N΃8Yqgپݕ/vK_ '`Ժݼi:dz-޶B{gZvԲ[ݻ4zk󅱲CZn^dC˫]_wϼ˚'tKѓ=^%So`h]_6ݕ f nt5}\O5| P|-J<+Xm|Bwr7E;RMUɐ*wtS&'%Մ90Xg]BAp);QF|dq,k0y,M@1z-WsX&[𪖶 ^FA€ʐKe tiAe62>3WҮ6-5h7 cv?5wX郓čEH5<ڠsJ.*'Rd@k05"GWmM>#]2{K|LS|fDsuR4t=r(;b猂.X+0sjǦ/>{PqYzr U%QTves!S7>&sM~?_Ɠ)c`\C.,[>av:zau_Ԗ{JƭA@*aQ!yG'jx%j0.NJx;c E1pQ._.\ e5em^v th4=1umH@HKJސ l2JKsϿ9U(ם/R9UwC 0סBehmM WG:YI +epēD!&霹,ǸV9kd kZWYÔ{o;{}PYw+׏->aoP@PO 'nN %O]5Eǩ3"uLOI6:D2yg㝒yjg-vkj'9h|G'BvXØ*:##)42oin8}#W_.?qYwhkuz7ƒipUcbY\"z'i 0lֲ,5&{͸<Ze*nQע\սaO?.u趴 ᵫ6Kof4@Hkp'ÑVpѺM45wȑf,׆RLTmUItf9|g [T[흺ѹ^rzܗ0F_Y1{e_@hz}lVaѾ/l|{Q =}8.;%&w]{F\>KWWY|4cb  =]QOA{{u|&O)opt7tTDZlQa\%+qEvAM[s7M-3T#vhFg{_l-X[cO\W?"h!-,K_ LNFv]s 1/E=,xک me5IC_E_(Ƹav?8}ry6CCe^WGhb(PRqY.4 "c$ *G\U*Etrk+^>E!!F!;T6r gBMi MWrhck#9_O>W%os->/ǃ]n {ꏊg˥=W;@q(3Ƭ%c)pY*^kcmhkrrFژB%Փ V=S97$\KϹII֌Ն]3nF)хVsu mӅ{ϨV 퍃MA~YCS"DZ-/VgxC^mʹ{sFV;|-ejg#ݛK:˗Ƅ9 #@3 SǿJy5Ó# ;z}@-F/Զϛ> |0 ~o܎`IwNo=SC?lG{\~*nEFK"L QZ68PĹ 9cr0zv9g r !$]IhtWmXE/˫û1n6BMum(u~Y=?,(?T5:>Zo5W7iFalAqxz=Ǥ z*vƟ^ I%9!u3OF]%rQWZ]]F]}7ꊮ9kzpu!r`uVϩ%DF]E%]C]F]m1`=a;Ego LױL~|sXE2@Vno.Xm79r7f  Ƙcٰ߉ky1x^@U`kL ϝǮKRm^{8m+0 Nϖz^i/9vf1|%gS4amżZyW9݆M næ+ v I)tpWۊ7% oV|1R+eR`SEYdw1 2D؞yrEV4)n4avoV,ENka DRB,D!Yn5a-+`Ll Xby2&v"MDBMlVQcv2 dP9vuT-QW/G]Q%mWWTIɨ+ kt* *`ziVSW`q:*+ũD>ztTruR`m.Ĕ?~?8t,N^ɶB!T39DD; FS  XMDȐƆq64X E&h%O쵴^~#V.;s:s$_wmޙ[/$UlMSC)1>U6B% 7 ŬYNl[hͶg/NlrE$Cin5A˂Ak%lK]գ6rVt1h cQÞDIeI5SSx._f,b OV]_T|l.W#35p!0S1qHxfUUHO:<\ac)Th7^\"R")9IE$s2JaQiƝVc  B' L<([Ws͏Sdɂ4:#[>$Y%SaG1AA$q͕c")z}mMv ߜzU(2'yV K8JT-IAW6V7F8_bY'zv ~Džs y[#W Xsɕ<"a}$$bݦFIƅ/f迳.M{ʃ$s~~B-'"5gL& ]XbX1S`)"Z J1uf0wx>;ʿ@0ଘ* K ɥE֭\ vW{}(tK;Og3QI\܆ֽIJ˟rMtyy1~nPŵbZeBB}&_MBԚSѺ5j܌ccCxv- +4Y;ݍhm픣X0OstNj 0#U CrifC(Xi'u;B&cerTZNjʹtJIs`yWl%>ǭKJS.&[bqvD֯_tۏe':MvM t%u-_ b~7\ENxS|B`̀@_8CNW:S{4&M=z1J_'4)O ?{WUc?Df gexPQC{Njco7߀$41=VF `d3i- ;HN J1)scB{XuQK.T9B 
˅VG=FHd4[$%A:ij9'6*$f"QuMVsN7`a[i)'Ī0D'j5?0`9BM,Ƙ~=CJ{bAƕ2ί"lb" e2**$̼\p#v=q |F' m6P|!bAGN)+FfJ UY5&U1F: lp6HF_jRz\TknXek6j#gBN ?8{.9}TD'/wUnOn8as1Ŝrip"gkH4f NpԀÂ#j1Q2mpFj}6X՞ U" r d =!"}JzX{))qJ7DB0g;Ja8R38~B.I"9Ks샪]m>ws0Vs۳47oޚ˄WhblX2N!AJW!2A͞5xyݷh] 6@GڡOEmlw(HLEƔbP ..5xI^$/>A^ 7TD4C)VQ B,.8yBE4D(HPԭZ0*EY)aʽ3ø)af(X9+w# ϘӯGC?ZmWҤGY3rMJ>w7_&$U~}A?*]2hw}\wan08fC߅sc LaSGaf% E>Ӓ?{WFl/;PUYFXƾ섣>b7%a 6VSTUfbS,HSj.[v!l_ʣtzr8N >%\ 2ܠ"Rqoxr1L-<TO<xr7͝I~٫>N.GKL[~p.\a>unY>kcfuOeiFqSiO6CPL@6@x\k"*6h 44Si~_RTQG-t]Zs,gYnד[+~vtO=Ӳ L KJLVᶇT+y3NZiل!?dg{Q3X"jjgsЁH%^6Yb# :PSV! 6^$*xױv^vH }6h5A5 2ѓ(aԲ` ֳ: cA.iUiԌϠP@S@fL9+i?A>'QXb܌e(JD$9bQ!XU˱+tә9cFBpJl⶗(饹ax`֓$wGO17D1 gYgM6Ѩ55a tK*g{.l^/"͡4m_6sl,:,26$4uK|ڈnK~ȅh%,&R@*Q*AV rѤ@ѻԅ/Fw-K."ٌ!)b"v&{5W]I^5nSW+STWGj  6vp8s 6 WVJz p=\ꥴN+6U5̮U=pJ)2m?\$U5UvSbqV: (DKb v;Wl.Y%m+p NO+6%ʈvWBaWlTGY s=\yJ\JjjvV~MvU_B]ǫQt_?uGkO|<fx\ƂY5B-\DC|㲳.L||'5JFA͘RZe:.j'%j)K.e- օdwFu O h2ӌgKxFoB5vǍݍGv)a9Ol+DuAQ-,MQ)]$_<,H)(* Ov֐쾼a=o̽O! u fo6zRذ<",{Pл}`?7yڬ@Xdcy3c֒(b)HA"9TC7HB{_ :xOѦP6(:&DNik% R)-wBp1xD٤4YLHdI$Rי:3+~5Vgņ6,%-'Yͨقr}3<9_Zn_{}ˇd$ň(s f1X)A4F!S^%W) 7>u[_byrv|+eMVj(S-IX, )@t¸ǎ֡q(Ott7C?/-^XI1Hm!ƫ$JVx4h/U͍nDԤ]DSRq1+_jjz@rxv!Q1bHhS 7WiK@Z'cgFG=\φypF[q'?r.iԢ\+"|prٮ39ˌ$4 1~u f<ϻK.+"}ZvvFa]^l.dӥ57]qOmxS}ѣ! 
a/#:8ϑ'&'' _?Mino48|HӦ˧v򑗴{whmJlQ[_ƱR}nkYUQ%]Giff0?jqz֪ܶ$U8CG \wn8Yne'd*jNJoiYa`w;Ӳ y]FH"EA: @GKg~|X8̖-r*Aɡnb&k4ц9mjlFJ!WGfܴCmZ!)zmyQt)s-6.*̥h%6*V')PYISAX+dQ97:لm)љOUXyz;kݭ-ѫ%In=tM~)VGzy+GCw/x(֓3]u ڿA]h9FE˒4@oANÁOx:e:,dA' e9PDZ/:-"e,9BAoLrָN9j{pN(0$G^٢b JF)b,$AZf>1G̜G+ImX^/ Y K>KǥDKútrfd x) ?m{GTrF`|m052C@H,3MQO6K HM"yu"3|J9"] &rQt[KZ,`bJhsI&%)egQ2+_D=.5AǮ+vuf1w1 M}3I"w|Ҝf>_5@!kɒ]P+%S(y$1x YM?x;SJiZѰrk/%pn*g͆ X FT;RVJJzpP 8:=X2nGI94nc$(A/B|P(w0Q^>AjBWָK{H(ch( $`f7d%_L hu,ٳW JIU(E|YGZ͘=\ V(w|Y /{"1"l'd+i~;Ϧf I$j0:]Uiy&śdrzЯ'ݗ4wۜpk++w)[*kkn嗬k5*pHxfAmHHJ*˲ ߇> %xm "}ʄY,1<&]:4nVmxd} GlAi<L Q-%r4*ͺm7ی?9*MǺ/԰^(_vD{9Zլ{$fTT=Շiibr>n@SO3`mT:S;ℳù@Sz4U&֒lҊY"R"W.Irk_ݺk5^w8{^S _C/ѡm椔ht%vݹ{n|g/LfFN79eP Ejg\.C)wE>/Gʅ\ :sb]T6Ѕ Bd Aؐ(ғ(,l{S -1Ԣ#:9Ҡlu/+fQi?7Q}3il 6q)΂N;>v Go@f0@N "F[rdHlѱ} `l*Y38g߿O8} =È.'-&] e&X$]f5ƂH7q|[1@ԑ3EDŒx36}IJ7e~bQY߸@J+y$R-NuਈNk:d;t5E/)*74O.j\XVtىĴ@ 'N&OPf) ZyZ0XNzov>(S/-FCJCyhDS2JQPL(B\A$)a>lH } a䍳eIf(AʼKN$5.imT _@YkYw]}%?()S'i˾ZtF's@a vg_fD.1Ne';Rv(HvF R1DHV[r !8_!RP uJP$r:1tPxnp݇pVe=ܬ̩$juab2(CPN*y ްl( + 餌JPȤI![54r ,Cg6 &Gt( 5K_w-v=vEnͩҮ\uMV(#AC 16ASGS_"0to})Zzoe,9UKNg%Y+Y+|V!bqeO,dX8>>2l+6]~?i1x=B²Y$ њL+!|҆1狪r7%B"-/x :KY*)T%tT.r$'GacB É`>}c@GS9cΤ>^9gz^[Q؞t}j.ځ+#OvEDS-RRc>aɢ|tNe@63\cbG=511Y=eu*f^<γBJT0k"Id}D7 Olj|f p|80rw!j_.)$XCE4Tp(3YL2gi?fd}XkCjf?ZJuHQkewwp.qN6K-@.M6EA:[%#I| .C"KdB!Ѡ,H0a1jo¶ v|e1J꘮9Ş {m'my=?g,Ӭ`y㞻,പ:fv*Cٰ]X 3YǪ`XJjwN[2xN_|.|GүU.tRD5 lX3ORJd`gd' 4lt ;@.bl@Ed# W@'dR椤, b:<6W-04gSyb*/ 8mH f]u(/%WzZ|ABd&qCF΃H2} I$Gj Aȁ<LIYa(/rd!XRG#0'YKkE8߶[FA{W1uŪ~9;eP%X"I'#0*1@1IpY;$]" l&)->kDJ K9L*Q+Yw'y[HM2ޔ52<]1()K Ue3: VJU[4^{  *S}ޫOcaΧqqLyJ|lQz (r_;pQ: $/lt%KdNj%ѫBHE& KSk֝K4WXtYuDz,{h;Zs;H;&7}idG~6&MZ"\N63okοӃr~h1šX9)4l-Hd 0e%#M[.POJV6u@r+,3Z1Ȗ#vlG0{hl+8ԦJBf"؈t6ڱZU棒^*inR6A+.: &F,*GcM"ZFđ1C*&Ȣ:;Ywa/WI3шlo@Y| ^S>Cdg&+X2h/׎(vQ02x+kl@K*I\M8Ryt:c8侇Yw#S tJ1.\dw|XI[+9 (lu'+@S^ c4:{ .YDZz¶dZdHȭ'ݸ{v{@ۆȴ&0ND~tRY4,s/_x1LggM=_B?bg)tVAؑW(&hJu_+YѳtQ\ιWyُKD0YLG@#Iiס_η8zt ӡ$>96Vt_z݇]do]!Mu +\~ë].˥/ԔP"G{~ow~K,"-5WkŹe㎓xBPvB'̫r()Q5O𿋹'dU] :u=WOv:*Ff0FL ,jҖ` 
!c$fuF,Mlgf?d޳wG+`r +G?hH=qμu5"Q11*m{xn=@ߛ<ޒia8r(E-&EqWDa2Neqm_!4w7Ç-2z5v#ퟜ'_\~ÿ>2o%!@$IԷH+"Q߮Bǻ܂}~B]G1[ڃٶ^cmOChe[xiߣnMC\cXW|G"F*bNt*G|)B,77WK/I!R=JwRkø.26I:4nggplҖ]Z݆'X 泣-8zJY(Hթ Did|pj hS1㧑-hoFm"$B7̐ /{V_]o9W|]` 8ff/LnO[,y$;s}neebY|TWbxdk"EiFi P# )H;mV{br臭vM^%_ WK%_ W%_ W%_%_ WE"X%_E.%_ W83^>G)Ũ,%` Xp%` Xb&mZbYb%F@Pb%F@PbCJA%I5 6̝Z 2lEbSV6QŠV,㢻Sy1h RH c\P3E`.eqڪAro̰t)a1I.r@mS חW- [ϲ n!0~>}!i)] QZ_XfnxoALOWPl:}e`MZ%aZn4N{S"16IS7~'MH ~e|8u;fEFO?ٰ|ך )nzTQ77u}mmo]Bߌ} S# >b/e7;w<=jݎn3R>IDO] jb<m*gH|EX(J$.ZuѪV}Z5 O6Z`G#6 ֒" F!T2#AΫUX{2xg~=aOu&Žf喙م:ő9I$su2Io\uŤv@*(]At5-zJge6[X}y4 d%KWixViе#ɪq}s@c_&najihcg7Ci7KP4!`Y^r쬭" bJK*1dbixY~j,iN_.Tk)R?[IqMۻEi{ocP G06UdbI)jKSXrm7f]͠emګTi޿I J؝PҢr&x"ywL$oSE&;'\w[KVo ëV"E=IIdgO:P#UTR塲 ##1 #i^SrA;ҷG =Zy${1yaxtad[]N("qbPl, )S M,Vqԃ(h'M) CYrE؄ Ad: ViǬQH<a8z|qkΏwk&_wǰ( :TR';\9wob#Ez]cZj47Ϊ&ҵĒRϤjs|𘪈Q@3(8Ā}Y@ kTnp h<HD)TD2'Tfi%1#Aycҕ0lTt= YuVI1$0^x* ("(p7<`9rLD_,Е~iFRR.hS bZF$"8 x)PX7`KpҠHyip)|wg_ O'ֆH{?SS hsA3YD{f@BkT(1ʸ3{]KE?_ȣ}(ɫp+P"a:9RL݉L¥ܞLA^wo%P0 VVLjqkBΥCRR:r0 }wVˆgiuyd3Lo>mԂ%7_M}$zzyQŵz0/Mc 18q{EJj7>Cz0ԧtoZ<9oŸwo/obBq4_8_Lǖ.` Cz0 t%;[b|}K]͐f8lfqFa8ehz’vppdfs\uJVNvW09kuC#aoKUL@~^3>'7ߑcR6T*`MϴpWϟߟw??O|w0Qgoaxh~J=;M_oZ\MC{mTm9{ 2rvݻ|Kھ5Q͗ĥmڧ>wM }Ϸ_/I5UTm*)Q.G 0n@.\=$M]JQGZF?Ir%8Ub1f4 >G Fl@} vwϺ%AF=iR;_:c3y4+Pm.I]ȗQle,V΀lPqxj.qv;xpGlQlMJc6a( >FϐKҤAGN <^+)TV5*Cz ^ wE%K@)*'e#g@z!Y R i#8.:Ǥ Yv-:7ǂa:ZӵOO (D(J! 
,% `pZuL:;|GkEQ\EL(Hp(@ƒE*L< SDcK)QƜ(ar@FdxBR?uM~R jDO6̄Jw _LǯxM2!TX1Q6,JxWjߨYMYM>"J 3T HJC0FhH#͛2b6\ dG3>̈tLAn6;Lrq<8Yo&y1fp6Io!Ȃ )[lkd$- |щD؋DHkYE0!&XRG-H%&F˝(XX(ҶVd9ZoI#t9M yw]&n}mi_:(Us;j9 2pKtN`t 1!*+QD h,8"ŗP,#xF _DO)IJ;t!04)]dHRMSZhP"d<3` Һ?_<%/:qĒshi .:H2- [FΰT aƥ#bYEn?r#A(f׷?Lx)O [\t1.v(qV0 KGFm GG.Bϋ\Io狔HkM1W';\K^ORSW 6# Zv(6Sr]W֣Ir~e6u5_3{*2~;mjLq2QvCC-hx H""*rW| 80N$1Pș@UyD kP'\TĞRedQqkr532P|縳pa pʴ^(p Ȩg4qȐfzBQ]<* OSegyMe˼:S>S5:Ҹ `3ܛp9V8pK+me`<@FM6@{4).&0BdIH^Z俿-ْ%[xR@fFG,J9D4q*g Xi̬yë88iO7N /H{BzP0-PrӂY 5V1LGy8i^?3Y>~󊦯^/hx:yKd}B%ؒ"@;tt^3fs5X-JxP=VYE 62(Q9bD5T $kܯ oaϤ^o>W[j6#\SqEY(\ dw2 7)惌 V;kPbt#J9z G ݗXg =L$ZCzKLt\ %mZQY$7T( F#ivx9cG9[z I*`T8M%#,8X*˂1WN的 D d04 ܫǭăA6gM+]3k] 0t4lwREQ+gV0;l1t?a d&놋 CQl>-XoopTdQL-%$@̋$ޛ}9Nv߽yy NyMs uRy4Ah&Jm$ GiCs^'L0%W܂r'5L3D msjQ*[Ka~5q4cvޘ܅sc $q8w&d± lB&(578w-: NAm4ʔf1+2ldLrJ)XEyZiw1r܌#2ڝ cg-k 9Zr;AfS\vomkmpآy^Z{,aMvFfo![6[cR;ĬO+GjdrVk/;^sXef`E x*F_-p;ifCx.`8|(sd,5trsۛeC?߻_[uAR󆜭]=ya@cS nL|H6p+J /:[~{`_ T%8{?2)mҀ;JE4ہK5$lG]4~ԷG-ٯVm]xYzYw]tp_ΠfNW6֎<-&Bi=$龱׵J)`NkQ*8ʝO% DL8P=t:FY^1Tk%i􍯵_}n򃚖ۤ'8)͑ug+M-sŽt[յ[7ގ\=N9R5:=f9=.''rOjO(sIv &5XOj6}bz>tEBֈOj6v*T ]=Gb]%5D.ttP23+.u´6t^JhJNzt%HԈ`kCW uV}R!]I5X'Lp ]G]7І!])I[?]%>W&ծ$EH;]%#]i15 [tp BuwP67DWbš-$`]XzpՎ]V"Z/kЕhC1䖳0Wpu?z՛7x._Z,^YySK|y燃Ri]M946ZuЄ}ЄR49(E\>L ]%DօZJ.ztEԈSʴ-Jp) ]%;]%R6t 銥x cjCW`^JhwJ(9kT Y#U[JhJFzt%t:Ft3^Jp9 ]%bォ %]I]`]%LhwJ(5iҕFDI\#U[Jh{W e+B/_ ;~3h=jG B)ўyWr ]=t1b.f#7lٞv+roQ ͉k v͝zcj 9^{k?䤞ګ(KьrQ_k ?`W1:qEas`MD*&H-Dmf8YgO*n@`$Q eBI(NhP*$S)H +kCW X-NW h2ϑ( ɻJwzpUm+@{WRf ]= ]1)]%Ն RNW`o{5t$t #׈0E\BBW -#NW 徽ЕNm8n8 >Antyqx=;m9Dˋ_9x 96MFkMrwߥ1:08D-n9 U8 mWMI_Ⱦ{NI_3 id\~"NxaYX(`,qqQ+˒cLWC_Mx+;N[M;aLF&XOѣ1ߛ67fh=@0Rɛ9r˒zaz` ^]\Rm |%K7Qߛa; fGQۗ_҇TjHA@rxm&Hg Hs\uI34gy1Y ɠ׻Nּav%7M0/އD3Q)m9w](t5ќ=<Ӕ"hOQY9)]t~gMeR'{e?n=n}nDx?eAJ~A;",-fZYS`&K+[cH|th}VؿFU`S܇Þ笅r tSC8B:O0!S&J;f2 | \KLR5lL;"˸xTdc^Ϋb=!R1Tp;h*?KG1^gt~IuW,]x򬜛~Bv9V bʹ}{PQpa0\R|uSɰ̕hF`|gE~zI,)LzM)`5B94[ReL[e(*VA1$0!JSaG1AA, q͕c"Fw"iyi#OeVQ$%5X cN*!" 
ĴIEp#R 7LJnKpҠFHgӍ4mV@8//N3zHQ<>넬ggϧ+[> ̎?NI>C4g,0e'a.ʌ40L}M2ָ/顤 E߸QyDivezLhT݂'^"9LfPvլQ0EG)(ԵLQ3۹jC V|ro%P0 $+85rTa1\Jǻ1ptHVC7') 8'>zqY/*뛒H&=rmR^DGGóaRNŊkSJOA5 il)08vZ9klxvn:L6Syhx:-Nh93nON}KuVE j_FȻ'OMwֹ} [VKkb|qMg˪!˫XXMUX>"=z1Y^s5Y{xWZZ+Aw:mZ4Lj- TGRޠ|fTt8R+X*T;   kwǷ38߾~}ǘ~}W`u=0  Ck)16nU_VZmU MRf'~ m6yIg2GR;R(2|oڅ׽~NMή&Dt%OA3"s"JT?D1e3T̗x#,/\(W͹ci 1Ҽ7 N*'JVtH´sM-Sy/ƌ4:e 'S3g (nw񲉱"gexPQC{Njdi6߀wP$`$ʨ1\yLZ˂Î #7xQ£64ASgG>eH脽&DbyI $7=~o DP^ribY.:JF"͵4tD$TnٚL%FЊi55 Z{ k\j,^bZ}{HE+/Yg-`!asJ3(Rd^ptɃk#3Bx%ڪ5*Ct6W@Hm_jRz\Tkn/Xe'm ~MBQncNtwtwi8:l0KE޴4+_?jbtMȮWuЕ5_O,nCg[ khWޑfXڼ"p9buyE.{ABYكyE"qlg`nΆ..ߤ4 3 `H/[JL.0"7\6s%nm$P룵LxwmmJWyc^E488(l#-9q~|[Rږ~V5\\ LHj9ZL9/Ue[K4eq5pU-(D;(9YS4'Yuk~-'\S/h=.jG!7༙ɟ_Z̧\/ D ]r%`-8ct.ŠQ3 %+Wp]Vlݬ8NwO0Kcer+-`KAUs)fswAd`BChUwĤ"Фj~A8BƩV5w_RF'h+)!V*"f`nŰKɕD‹Xzb]uNk'K,ī:8UX"KF%5R51z171֜! Cri ivy+[^+yg/aS*U+mJ=> N畨7Aqظ/M!/}H*{ ԊSZl;>=qŒ>ݨIqҟ P&[!Bz4&JoR6GeKM‚zM-ZX%D]t-:Pƺ30 HΨT[spV;X#A>Ï{wGE1ӿm7ϙ>ֹN? f.+N:^|u[qggYj8kxHZCh;;iYnM=Vh !&d*!Qz04_8[.]5A4#hsImʙ|նf527>LS5 6 W a8l{/O)buF^?w,ؽk o=kFٴU`mYVjxKr g￝=lm^|9{rOsmPa[7`Ʒ~Xs\,vػ>[K!}~1%u|umyϳ_w$׷}ꡉ6FXqH缤Hr6[thr+isrKLnin9 7!KE̮Y9@2Pk1պL%7H {,m-JMDRhI\(`|,l 4!*-L66"mFBV _ɂ a0r}[t UQgݭ9*z&c:#OQ;Dփ؁.ڇ%GTU琰Bّ)@``xTifά5¸oSl=º':+$-v!\yZDfon3: ry1ow:83XaFMۦ=!Oku(/B1W>_ubrRh#7jrqZ|5ah?\K9MW['Wy7;4_\GLkc_->I L{wY[vӡ0͋qgs*(+Q*@d\[&cFOU 8Ĺ-kNO<}G;ͷ._˚X^MJf7FNz%eŞCyP[Ѽ<޽;~,jt*fI2ߝ-D$OVN2x,="ѵB`0P!(҃Ugu\*H)X#!$_Pb*>٤L*2x< S"%S ʪV$?!O OON8yexdF[sB>(vCѶq (hL ġ})R[ٝ Όxx#; Mȁmht"-"SVEkKr+qH.I>Nkr*|R}vA@gۭ܏%;0?,N.>e>)-/f'_ .ruRmܔgVͻr\,7o J ~Dg׉7(|8ymk4?>ws{m hp,_y6r!4xЅփ&Dxupp<}p8:xX7o1S+\"p 4[_NFro=ҟy]vbak^0dY$zE=A1f@3 &]c=kΡgfcZ783/TvS/o*ߗ/IDRȃY :{&lu]=x=-wz._q1=񙛴-x]nTH”7|ZlHtmŦ;gb&~][#ziYQD /OUb7Ub&6E1'hPȿx U>rqtS/qWrJS&,PD=a WT\P뒍T֡'6(Ӛlk=0ᨪuuOu?׫uZF}߆NaįFS/Wۂr])BRԠ  H90xHCR)֔H`Q+}ϜTBB 32u@5[MNV7%-mb&PN/Ǟ={Suso+/`GhZ *$^'aTS&sY3|^W j01D&,5bC!80(ͭ!PZAU#[sGy{}X\֑cov|M)Xz+)K"ɕ?͟Fl9ɯ|O=fV2TT6E/`vz'tpWJz_kK!EY!\ھ7_s1VNCUD@OI!'פhD}P+AѲ HFnُJ7,3 Äq]1rڵGU<7' <-fwβ EHI[Dkj*i)T&(hU2΄jt 2mWĶVglƚ}j;`_E)aNh 
TJ>tFnُq1w$0^XP&Ԟ_MV( Hg/gUZ0ʕ`! osi9haQ%krQ M1!G(c.$x'ܭ9p-0 "vODD"ބ"$ p$]\6`muV9l2l=ə9vd`PAqq`5Uf߲ S"D{[BJ x^KcΈح9ȫcZK3. '\q l`TIUM_eX  ТP4&\|\<<; lOM w5Mle7ȵq:ޏAMޏy:YEKd8 ;:Z%qrxK7k9YnMS;ws]3⣋JBv"Ɛ5ٳQ:0$ ʱiA'A\E_[&"%BiQ蕭Ftk~,=Cw=רw8KS Nk(+QC7szas35 GI:&q6U/?{F_vضbe 8d3`g936bKN,bK-ɲՖ8YdWEփ67njy}F/3- }k: "c@z9L둋\~i=j5ZjQigT@rQS ž "gy)Qqޟ'>e~HjGIoq=x&A==Bmynhټ{_dqMNږoh^T76;h|iޛB^]Ql=j; _dȝ{W3ao.k[4X1k3 =To"RCeb֕=Co_CDs(`DJ4\J|$-,zgqy%1d¨,d3JqӪ92-ȉ,຤ &imP!@M6)iO9+ZYQjY-| *UD6)@ndR!GPdəEP$B^NԢsiFu'm'z=_9 ,$,"ZP^ώ0U2Gkr!*A֠V8N v'Va\U9rm $8 Ì^, sT uƖ^ت"ģiIg6iG+~'i>D (lrNpc!2e ZG'c Pouމv9LUZD/zkv">hY/ă r`s2ӌ4.uL絒ҩ.plA(``ЉG{1c[߫5ytCCQX;]BMɴfLKz@5 o3IzO7frҤiIu:(&ڊe+L+iiKZf0ADCj.c)Ev1]ޗeq8ݢԻQ_={{]_ŨluV @:#^?]*ЃMc&vJ_m7OA|^*%HR"f d0J5CrDT685Ɏs_ H v&0Vh(΄r%ەBm)T݅|7=fWo˟]]=`3yèyՃK]+өǾz!uUvgU!wZꊨn*T TW@%tEsP; *vuET TWֻJ*fvF]ruU|UY^"pe.+ȝQW\wE]j֣BRzJJ!uU\vF]j֟]*Aw%+fcyP_O"2zn8*lIkY 术l.GkS{|2>W2,2he4*{ ixi5>ets=a_/y~y'DnO9[eĮh>fzp`lVjeDp8J2h 4R>kqJkKx^th6đ#ф+0gtPUdt1 J˨+YUh|vu|xEtW!R+ZeNlEq1Y<V:D3r@Jhr1nL ZM 6p܃Y#3ʡSVKQvѴYzrg |n{<4§/EŊCׯp4`fe$y/070^\Z3-TpvT"42Uq 5v0LRcЊ* *gQy! 
8Z0FM'gLjiTKI oxI r!\] gW%f05rV k~}Z/ ޛ-[Yd.zQ2չ+@hyץOfOdO-)*j]Qzu5n{ZNm$t5eE- O]Wyr~Kk.%gs^ZUt<_yWqtp9Ӿ}OZAm>؃s>Uk>N;/\A zxK( kc9$wshAHRkN=`Q+N6wB&Hi[Q[eN/9+1L|j%H5x\]k!"hO\&$*!0PBJ1'QʧA|*^r@35pv+T UAgm8:Ƣ h5IDr$I Y$@{%p+{< UnR?]:aU?R-4:=ZDrYNIthAłT9v,?ꇎ75TM[o6+ohQY1:U)ȴ5$KIJL-)h:AjWwx-y?Hb~{M4~ ~܉LG'q|>5Wˉt| iyr]Ta~<,g0޿n4O곧Lk~E.d9熌r8ŀ3.쟳Z EwQ:W RgVgͿF8+ƒw^:0m 1WWhANQ}xT(knՇ2p{O_ꓺOwjE2Y%ZWojMtx8\|TCXupq:+!"DS?.i8-$ fPj?iҺ7n^'//'gw zϧ'k+lm^^WtqOSMoH esyfyOSP(Ze'O<9;X69i l$p$az:\dFF篃a(]_H^9?7tcS͖t'gINΏ/޾GwG߾?`wiE 1/$AM?"55kWZ9ɂ?.US^0mj%I ( 㫦O}S.fG5 ׏|&1?ԝzaEM_oRe)U,*xXt.ƭ^Ϊo#]GK.H~8U r!+o(OG.~衤Xq2"#d~\xX?ts~F]JɗKHJ-/ex6,@mc36}:-sTR򂻜̲MQK&%0/f-#g%㤳F֌'P .ZڪwRp1WQ;_Ƥz+ɐ_0y)pJ_6TR~-8` M1WS!2Yd\J42iy*uc"j+0g@>%e9ʺ/ucY(A}1ϝ$&K<^)VUwNy`裱&'@ ڨ" ٨hPƷZ#g`BVpk {X^6M{㗳<[l7_lءBGJl̺`IZhFǵ2j-l{AR.K8錹k{V|r)K+T q\8M#`;[זK5ZǓ>@%$ ɳ̄4 eT`sr\c*Ĉ:eyf-FHR\f%(KL1 в-9rm X'UTQsoDn_p٦M|&-fLxoV JXQ1%P @EL& o+;16l3~= 2y5{P|o>X~<;̩nwIUolm/4O'-//_NX|a 9Ygu"!+a9 [uI՗Wn< (YIC␏*M҂J ;8(f=4zq\k79 s=W}~xG뱃#oEvCt__g &&dV0X $ISg>XFi1=*a=txj't_~}p %^{m׳OpƩM)};ЊH_ T-?}V ?V;p6` M#.ے00jŦ¦!Q:;dF/ZmlΞ5wͧCZv`>m%yZ>7iuNw3zжM4^ɺ_XyGӦc?xtWW]xdWu 7hunc0>'X[CT: _\h-Wx]¸#-V UCWE~^݀x R@ !}bu l2xv,deV;.RY^ k7oo7NWR ^3qڄ)Vu5›-2X_PJ eAҦV[RzM@z`*ݠz5zO$z~Dh Uo33[VꘅJY`n1KFVC9ٴs˼A^w. 
]cZ詌yc~@g.djij=y0BGQthdP!:*iaƬ~FX./Swv[1zۗrn/W,碿/|r v>PX$/L,|I#F4B m%e%׾m _^b ) N_"[\W-:8whP:k6.};v5Vڢ#7h^Pn|%eZ)ˇxBu>ϕu--V2tvӶ1()ia SRԮ@Xs'0 sPI R/795:K[t9JG1pI w:W)u)G#.B@de>Hc6 Azlzp-_=ގdx~zVwNM{]$eQzp1[|Ιbv3Oim "KIiR/[-S?PyK 3Ʀ8yؖ"ܥ'W]M!ef5LN?r[031圔XoH Jl*'Lřu6FfI>F*nçtv{I=dV?}5W'/ zoMqp濦/ X?{Zڟ+~ 1w٦G>>fռ_r2i떫:f|P;: ޙF5U:(3Z֛%FW^eWl/;WH;C»ߟ-ץ)?y W ^&8<>Zo"|Ͼ2ߚ3l5;'U$^Xt=x馢ARMC.]JNhG6dߣtU( GQSnwQ#>Јd~bd; Wiq.fZ,~z돺%휎%E`..]#]w4Ą-u)c9:*ׁȢW\% QN".*)Oи3%bC.NzQSKDPDDSWl46i'9{n C{ l]I>~MzO(t$[HQR6Ftw#OR̓3&e]4P f/›( G@tqֲ)@0rXPQl¥Ɓ̜݁ld͙ҼV P67Cky^ϿU໏o{j{nrg~bF~T+7E=C.H%%錵n>`Zu׶m+gK%FQWsc|M!ʮ R TkdlfndlUaa38 }c,t#mʌ7\7H @ ]Lf_&W7Z&KIE5S%(Ӣ2;IAUsW@&"r ߆*Bb6VlYN0sdvMITD1b73g7b4ߑac(G6/>2@^fNH+je BE̤d]WV(ؐY!33̢()5D2@a"&Z7kfnPEhKmX""(s.B`cPufΌ=~#`*U}*/ u46xn̻|(&9`5sN"Hu.zԹ\lu(0:% Qph xNCtMڥJ CL&’ 蒼-F,BMz6dVK4h)E>k}aۛ^z!u~y~* VCRs#j%禮eϹD7mY3s>|@uRhJIAJxސIƐi/0XXC׽x$`f1BB`$8;6,u.H[J`EC9"PH4΂(A$J[#kP&%(ڡpYgL$%W㬙9{YW¬w6G/K"ZEң+ HhB!]27T *l$ -v ! WZ#xYڑLK`NE/E!V>`BʀƌTJדQ':6x]@H6}t"TJK!X@y[8m 6~Ǡ5owKw3r)ٓ6TTBPy¸ zsP>sSH\mhk"¶̉=4f.rhA[o #c~~SjcS)5Sq U] W\iO W޵u#"˶@MrMmbYr-9],_dK 9&r.J\}5pe\z{4Q[p8ry^zWR\Gjӥ %O/&AYw`nnX0DM-▎#?FT,V[Rgw\ bfr~6[KSkGp q80NCyמ/"]ovF.B0;YW*5 SS7kk߆O%G I9ZaY)e`ccmԒEhXlc trE^ܻE~E&P :bnjeAneYٵ<[F[l-# .>㨅7 ;Bsfl \ju*T"hV\jU!WU֪]+3 +>iWDloઐk澵,ppUD +iGpU6voȵ \jv 3_$\Yj˲6ywDž-aK|* (FX6d OJI4}ά̺$ѨW|4C~@mOw|lZ)8C{En~~~y\luOYgzK9Os\Wi^o޾{0m?i{ȶ3 8X×@S^rWiA J@;5& g':hQdsa,$nC9+Ycz7&5n4uf| 5%{'E‡c3ʹٸ7 ,Ϳl+>w*fyWz-ܜD)[TC֨Y#H,c17(`8/lPnKb]YQ®;7~}D8+]J8M# l4h<!3v] 3[GD(av͒#r$@dՙCe,^LSU#g, BLɦ5w5ԁkkR :ْ*ʣ)s#m4M*DmMDY" 6$ ϒbr"ГhP"kL,Y#q9lGt^r ˝lYO6ly]+>Yɰ<WAI˷:PwI{Զ2$='r~7Yx/YePr(u3s*2.Yc)p$W)Pcc`eLd`ԧ"M4 %W82TmdFXOW8cS,tXXmrQ49U~8Or&ɟи`iOrĖ$27ɔDfXHV5,Z),KHa@CUi`Q` ¨֙l;F@Ү!eyhL̝`kqǮP`,"czDHY)_KLIn2bdDRp>fY :iW^E5r֨}L 0 "V Q(Y"6xap 8n\ Ec4-L֮a nf0U.kUf9D"!ȒiN9#⧓DK Lv鬳l2.;\i63a*ɧ< ,sJ1Hf$ 2;HcOոcWPlOa /Ǔtkk𝭎k8uu;`YwޏP@c) 4I ?񴂫#HLy(gLKSv/qqu܈e#TJ24QT[L"jH!N4$RKԃCtX"Rz%472Re,pϭ1&dὦ/fE[Z<;B7z<7v6" +xGM Қ~ArNbiGb>IKUͮ:oWUvy]uޮ:oWw?L4Mb b3қoECĬ+zϫSk|rLnioHj֚xja1iS :r,r I?9X 
38Gԍg렌ֈ<"^O Y!Hh9^,#}.Y#0tgίha~N+ }Fag-/i9#-Vt2M͘:o`eS\4 gB=0kcMt񔤖I2g1tq6"$$%j4w@e FiB+)wvFŽJcȤ]+MU9s,;hc5,ȉ,)btI%0Ik@ 4L6)V>s*Y5r֔V lF_BS%x@nd͐O\NlH7Db)ۤCLK u֝_ πz ~f/ ,$ P^ Tɔ9d7^s+K QRUM Ag'uvBxw;J1b9 ogB a˜:ao*!uʮ/8irq+5xb,^3fV*Z`T m}V pcEd8 .fBI[<ѷ -d" ,|ȶ5ăiZ'Fs909iFHuL+HNdiGx`S r_ jBJ]!XW pǓi?*QQj82ϲSMNBgut :A'ْ\0ZeT6"KZ y)D2B=m 7;RƵݥUdU^K+fC,EvZ9 oln-(|(B9_qn~I33,r D^W`̕μ~Z3 =qks2<TPS3Tddjq22EuکXR] dF}d(|8RV50FN0ԠiRg목]M{5_)o,l}.~_y/ib8͙+MV?A_!o htf Лb.7{jқhtO{ےWc^תzM\,Zt>*LTs QG ~Ϥœ'B7yv]P^*0>s!}{O}2Ҟ=eؒ-w 0&׸8E7d )&8;8 C8pq:˃@ l3{6nKr1\`,yh1ԁi&`ђ/iB=.JI!sbv~9{7LړwZ oZ$zsmQ#p2ޢ.54DƔZ xA 7jO0'ӹ{5KEj_/&m'nAI=U݈HefyOpb`ŲD' n^ Wr$az\d%?F4}1&ʽr"~xo4鸅cQiTO{X8} bǿwo]G so=:@J$k+ق[]V߼k0Nתgp$V5#Ֆ[[8y\:]ST0OliQ &Zq*TB!][o#q+>Łg՗K @>HqIX-Cj'Op(\ IË"fMLwUwWWOO)?6se4DzDSո ip8?@ ka_ b>N"E& ﵉:0Y)&u9;tT/b&& HbA2j̮6B+SR,M4S7ټaQi,&{%E F)rVViFKl@0ۢ7Bh4b 18YBK,|;@ψ6ϙMg_U7P"{1%7AVYB VEE: 6O'K;.JKnòvis\/gML8kI'ӽTg-1QьQHk4wz$P{5fs̏GW^]Ymj<勺Du]se~frbǿ]]zwчcEݑoGuu|arEPsOZی^_fбCztl0O;PـǶ;}E5EXa^OMZ]#^KMZ[gνH%ʡwXSġZLn10ʕLBbyekYKBR (+$lR:dq8ighْ#H r^ T` uޕ Itb"Wtm(P[(#(@.Zz>PE**,}6 BFolyjcB!Vl|}#kwnؗOݬ_߼F&o2zIYP1)T >+5f'_(+ 餌J@#j Lu\fxE3C5g9Knێ ӄڼ0]w]`&JՋET(0T]'Mq)OgSYh]y 33g?/g讳foC*%ƕU$d,(&tNx)ǻy /c{$zx}<`a kx3O5LMVEPB ӟ/)lJpPva0 '0 wTRxT<\dIN pW^tI"[ô^䑆_/ѿ%:' gd7W0e$|:vն[l حQ]rk]d ),'OI*%'K+3f,*DQ[[Oܬ*'i0=gx=Fߊ t?lUdšqu*M`%l"L2 KIg`4.IU2d2o!DrYJ'LP_#팔'D̲;lOr,dOy.>\]IW)r3ɟ僼S^j<2mj(M/KYoK58buD񏋿Qsr_?6aV8~7nuiE,Kt;J[HW[Iį+mc{ѩej*m!^ogC駟Wn᷸[, ok,~|wk6Ni(>MVP/m}xvw_NK=d8+öD!HĵK%"ݢ Y۝=/;/W7\v^xhǫۯIo˥mJw5`lL;JJ1ăUF[1n:=W\v{m0vu 3zIh!9a,B*Rui Ȯ%lipӒ|; ]֤k_mq4ӯ1WmS}ݬT_k̔Uh#'/hF4st6|MnZ|mx`覢ѼRj[mޏ+66iJt6LQGLo57Gh6"^q<<2!%zBQɌgKYpK )1ԍC3M݈u%R.(2afTVI.@V^F[j8R7g7Y~ -ȍ cǢ_XÔ%YENF&aXbBKʘ$$.k')I|{SS@QJdhb GKfR&ZxRoJT={Sg ~g}X{jא\q  \+fy"f F϶ Y&a4fu.u-eWxXPAP!OʸQRnZy8I׎tF\H׎tGJ6:;@uZ&<*Vu=q (>[HXJA[/2`Ր*0jz6 ѺDwC5\cSmcQ/9##QOUE*C=7R$|6(ݍo=t5ϒBKz. 
ɨN&xkǞX]Dq@m#iiY2IydZ]e4JƠ%@&k2B/"bL 8)em' 1D,"Y2"VaD@XL>jd_\q8}#^wL8.Gt`2w:O ]^GT&e\| \<<6:~x@Un|p5O gv(V^; ޏQ)\u8YwB;S#\T.[ۂ%vZ!G Tpvn8;wgGd lT"kݝ>IY*WN񨵫*G* 9/qzSdY`r CHP(rO&sy;B grzP{B-Lsf*1o>aQ;@1?Ŵ:VbǢ%A*K9MG668ђژMB`ex:eO8dEa.]{ʲgl1؏ =Su"L`!9i2: 3 ,=Lo I0Am2sBqNԵhh#9?^M| Cmd/w-2/r"[S^)Qq9Ǵw1(s6T.%l7_]Q7!%gXdSRFGcCdр.d"N 4\u`UE=;vQ9c *zM!S!x/2DQf3 Zs[U<^vNqȉ[(+M9_.ǤFYY`6۔:$ɼ8;YW:=[ rRD ;`s I 4 e"s%mL+'!lo>8 `Qeg J Xb=FZTx;iqM, s"Pcd*git&sZ(>ۗ ژ30L( 0,%$41Ua\H: k-GbmK@AꭈlԖ4 s WDe[,P{A6ldYe=o#! %zS ;9FȾ!+kA/ebE?MEڮ&;;oKtDgOlzdW/Ob!IrzfLf:I\O]G|B >:eg _TyZhG_TNZ&~jJ7[oLnހ}VUnmUl6PiF]ޯg+hTurdFںܵB/V:7lլcw ti}gy9,V|Bh{C |9) C+G_HsNoDͧoHM?cG[?|+4Y+ -x[cnK&2# $Ev^Li}ȟr?EZ2ry|bV~r'sU r>GffPǨ.X?ZeY%Bi>p՝НO۴p.16$Z,(hZU"߫GuSlr9AئdLQGʑeLqz3x tٍ}ӗk|V$yyv%lp|(g_z0=exUme`4nOW_}%cz h7ERIAfÛqW vS"|&-(4o>Od#7];lL3_7MoZ֧$ˉJ-N߮h8/BX]5Kb|N/M_/G+.kMGw7BlaVQڮ!;m:BP(wG\8p9Ξ3RE%p+K G3vt2PG~H޵q$؟- wyB?-iR!)+"VI=9$E7mR %ΰg>>d?ÑkVr嘋AlYمQtxHb6m E)x_ÏdK /xNZͽ9,~4=g:$O 2Fo'H?_9N~/؏]^[)>{ Qɥzz)CҷC(4 wG4ܯs2]Zwr4rE]F8h9{t;枂{C(]:[69ϔL%Td\tBYv9KBzkRey@ffXKANE̪MgǞSOWc4\S DRL09h%R `ZhmM\ B5++cCKL`rM:G,`bĽώcY% t\[Zhױcұ2{J b4]2Z1drsU%Lhp!T:vD:6%bJޖ߁uh]m겶Jֆ> sqMP͹7zWR*U] ۲ת8ǻlp3_2ToUiOkt7!֫W{C<_ ^R\U=mH%f6$Y$:w{[`̒Y\\QCr>%,ѮW}\x*}1nr AqZ$N9aS>&ϥfQࢷG#S7=Ia܎Ƶ1:#y7cD?Ɣ=Twv.y 'B7+)F?-K)&<%%srdv2͢vXzwi} HúչY=<{a-@d2$[x"~M"Eե2qxl¬KVxrK'ףԮYDJ ǃ9WH  Z3goCnQxǽ/ͯw͸# b9ȼ)S#>۰_Z pZS`5aR֍iY0-rIp<' `ILQpB^YMgZ]\cK_vܳ)KN=w5~ҳdN]NZu,!9K_w-=wB:%;f{S6*Fd8|ci)A%SyWSJ|Cc~y9V^CD/ @9ȳf>xkBL,BEy=ҏ0lg$WywRITGS.UQT!s|0A3~W9zD8"Y~C $hY#|H, ĝ[I=؅W7)#BNÐu3'opa;,3U.ӏzu 8ar1b,G#ڀ:CLCO'ҐGŤu;4QD#OtȺp٢$Jd&Z},gGZ$"o pqZoB,UDN!yks2yU$f$wtL"Cq$HtZ_M돰)}Ma8chۤefvð}tbyϘd%{^t:&_8MBh&#MXBv B)pxbfU;+}ZV{O3vkTڰ#K8o-l͟|'rq~<^dzUqKS[ND->f R2n\vRS[,D3xHkZ~N{ ̛{qd]~|QrUb8.IhmteI+:-VOぞT?!||Lrp6,6]VoN|yK=ggd.Lf?9CB;,K<>p9ijdUOOiyNs=${ fF偧${7hI`oN϶J! 
K_x)Ԅx-.޷% Fݔw9bYf 1vBG  C<[v:g$jCzϾ(Um1=QlL 5rs:rL4=ү $!7}v\qF ^^QͶ _>Gr8_ #WB蒶ġud)d!d?!`Yƹ,[>{uoί>:sdぃ "$U[lg'zh=V̜;R_e9xI$:_=|bʔ2q@gIXBKց sqW+ie0W'u$祲ɡ㚩2YvNˠ }a_\,J:m;zB;2ks_}[WaB%C!@$rLN uJ%Ƃe ̠0̪D2ϡ˫lNy(d +R[rfH32'ʑ\m:5xM&ܱwwdݵzN gYc22 arS]/rz'2?.1A~W M&e4d4!xݬp՞ ~̜uҌıp=@5+LukA:,E7/1%3G%5$e-鐼iy_0e{  ~.{q釁O7`͟WƹwM0V脑\bJAb I!٨3KqDmKC6&IW* xT;db[Rws{;>GK[ohks/*0_XdF)-(|z55DQCh,5(Dk93ǥ% Yq Of{/|$g}sG3nF'\c)M/dRIEc_?O5܏֘j/şJKq`ewdMiCpbθ2%-L+k.Y)dV9!,ܕ3DU:YD#rjH*slS B>R"Rr& | &p]Wm:;:7B1%;+z>9L%t4A ւ<;2$hQ-tLٺuhqtM]Dz! $ E(#A1ʊZm:ߖ/͔R}?-W& e$ j<~lz(Y[6 1.* kZMњD#SQw l1gu骛aɹp^{SBMtmS7)T[S!0<-6e"}=mOYzCJ Zx%}&զ*հd싅2 ^Xy媰OLX'4~4܌ Gl%Ht@1s0qX1ZPkgYrJ9 !ZRhQYc(d'D\BXd2ăM&nr*Mݥ9bfʈ]m:;L< t j{ vgs.%B$CB$P ClkJLIdBx]X7*5+m| Vԥ0 45 Y G(UbH5_;5taKWe| 0 "V=QX="nH <{DpbҐXVY/3SBLl2Bϫ"b1361*)em@$S΄Hv 6Fb\ Ct#b[٪@1:Iɾ+"&m1=Х, NI˒l5J ^!pXt uRu{QI]#zOb!aۃ/vzc^/pw2H|_!QL4>4d] Ez} c-'JYBMPKZIڣxr[!ɑd|$[1_(ʗv}XV*\K܈JٻH$N'aWO;߬wmi}?xt^ iqW s"4/Cm̨/23,Ƚs.Dg,kٌqoЯD 8MM9ύ uzԉj= > &z=ULΘ>}4]D$#˾f}wu `5\\8O7,p5/7:?ŀ?;O|k ٮAL ^橢bd}{1]q.FL@&TQ噖^hDZdr)܅/IRޅy!s)E. 
Nf- U9+m7 hRW.)9։%>SlbRV!9Оc!Bqo'Ar;3;ཝQuVXsPʑ]lx!-T bTJ(Ec+AqTdNE%,k=&%QS`1$ ^vlgv,EaKۘ`w-T+WpL%GP&ДN D@`Sε*b&Х \[/M`Y"/2+/L" )ƕ.Z.c ) ejuҸ~xwlq(X9R"/ -.E:!-I88M#-㣂owة,%ğ?KoF[&Ym c^(!ɝX:pΩ6somݭl#_W5aLEj|cEma$Yh$"G7i,h'G|F0dboݙggP[| M)Wb-R"W_AHفa9l"Vu~ؤ=dEhèVxzW= Bm$)^e_#g :"wH ۵N0dyGR}`XFeDBlB攀K咒~`~8B7sP%f$Z/@(2gEr8SFt|7qjgll{JHHNAYf֥Ԍ@ٱsɲ"$(:q(VhQh5O"l%+BBG14gU /q]M UOg8;TU߿WMS?`x1ϯb8W*Τ_=ZuUVMytg7B_ܩi:8]ϐԿl(WŰK ^b?} |==qU e9vX; ?MmG7EHn#w9i=Ci$#YqpM7UU+h*U\~ \t"YS aX5a[8Ӡ^!J pYVGA}:g,*/{ s%_a;\ 1awoW- ,LbjT1Md';^X_CCpp#ɸ1~xOuJ4pׇCzzo[h|!jq( +hU/kT^:iNOHRDmps<>D@ GdM,i}+[Y,`GW$)4Z睹1t&h ~?/U;Eߪrj {u'{My})7( UwcGX }o l8"V|Oo;ޫ4'`ycK\'"/30qceEʹkuL@v{PvB#v \֔DTD9]D&f#X=]su&Ζg]ڵoRˬ=F)r*- &N8XJf(otFz6Zң0f43 Xz[":aG ^b2sx9^o4ݒv*F\#X^|Z(S|F30id<3֩SF~OlkUڼ m>ؚ m!v󭱥/hj$X;jAPX~XjC.~:eDUUE&H![2R:K02Y%SQD2 X'pU[PXY00lD\i)"ZANRxH]uv&f`chJ8:Wa ue}j,^ Iz>8O?VdW~,^DZjp} BF0/eN%0&)\3썷㝢W¯;}r_,\p[DSK޿y!F:'ܓ7Ř:}ցW*yS du QԬhzGx59 ԫ왽OֽW<qŎҗ&,煠e>K@;(wkƣIN8N7AE)V. )&;{FF{٨S썧3b\]FKߤ,LBI %,Q҆u;(ơ9&LPx-d#Kי\g bɸ 3$Q&eogT%:V2HBpg*Xt;VPBzI]£- z7 .e Z NËN:TAղ44G36D&vˋzS'32/Ւ[Iȋ_C>SZ6$!n()?* L6p\Me!7E(P\& $m#0yo-;FtrRpYӅQ=D{(_ $KT IF31z3)Z!sDL x8Wzê =?#m]o5;>j>wx׏L3/H "ՇR?Vpˆ(/7o6谙+RܺEsY_7c8qe*"|ڲ_٥3m\+ZC~{v֬j= yl7t;͆Je6ӮͶB,YczW{|Sxs4JaE %h8\SgFq@.fX|uCtwZ=㝧^3WϊniP8շ<ۗO~홓1.W^hvdi2niR.R:B 5g@DT62v&ߓJ ;RMO UD+PNfJV7!0\8:K8f1`]H 2<=ZZnǚ],RfeŁyJh2pV"qc* PLw|&'a k,IY'JXUPNYD dV͇kwB?jғ€$M<=3}cF!Rmzi1'^2,MF54ʰvI"{?sdLD A'ZM#e ])d * "ѫrh#v4l:qS2bbd7(;A # _Y4KL -RbV+-J)'jfDNUĐ>R9C3CpÃ:W/cg_&dxW @RXLlHeZ;7 P U!11̨-(,WJ+-v~rz8IfvSuhJ1fF%g- +iխg&iy l~lz"8[" w즟ޠ{goH>K`޹}[Ւ+[RzyK竚le3 1`؎Qe>׳nnN Cju峢y -Fa(Q՗ q6C^ '+1Բ5ԉL* &Ի8Cu<N;_~8'\s=0JEQ85?mߴ]5͚NmӴ]4.yEG)uh{i>$q/fBD8Kzfͬ99i By<Ӵ{<ݭTKu!x4dO,sM!?I=n޺VdY ~XF,m)2NJ.ZوnQIzEaRs !n.-:hw<4PX񐭶1GPP,jl̙}Y8$}Nuv*tg:JRA?dwu0<6y;3YݽPkyݧZq,ARgZxJ_9BW"́WVhN"DIF+ݹg6cjV?}'X9A>W{Lz>,a/9hk0e]̗ 1JvQI6Y2 Nhj7Ob9a cGSs, 4rQ" ٨hgPƷZk- .}7yH77{&?_^AMJyI{_surܢ6%:b{dhStt6.U׍y:u+F{՛4QRb%Rq@FRF_էg:HO&Q}XcSb}J"ɞNJ q^_w=a+MkicF|Hs<|Eg^^]7V:M.>Q#8X >vNO鯱Q0[vFC? 
"Ml rsތjAsQ7aS#LV#+yr$F۫^y]Fz`|yW 4d IP^vdNd9V'F֙KrYgz;mIQKǞf5ju0al9:$+.`;w; HeFgUm]uɈr Q5 3vx0֐Mg"Qv0kvIqcJ >; ʑWy-N5wTqsIXe?I6WV]2<`Jz*lsVkuޜ=SG>zN9#<:)?D[ ^$`]mhT Ӵ3}GխY0K6JIA]~KrKwZ=//~JbGvtgimλ/J_OO~rCGOZk&8$Jnf0*I[HJpp֋Т( <uRvD~aOKӝ}3%P.07ӭv˞[JD&YVq ۇE*l ц ؖG>*oN!W*eLAE2a|9XFڃy(|Y ԰budL>>h9<%d%ztVҿh97 =tW:/r9I!l(!DӟSIV`YoEMF}}׿9!$,"Z>;THC׌@b\B֠VotO}SS #CΙ{=pɝ09 Å9B~DxSq3Z8Z86*:5D,6jZJRb}9q(c%i6Jrw"Bm+q`6EZ[" oY&PAdi'MBoY36XN cўxL4FVȪ!ZAK*^pS*P)2ͥD|6@6L /;_0Hj 1p-8KZtV# mM"6%6-5gz<  [W l",+Lj/)NsCHL (938M[ũ%!C-4*XGP[AAdj݀N%kxS[kN IB%C>Jp)] Lȃ9e2lrJInpTi*JC{~T Uj躽ɗb ~锝ߪث)V|݆7>ٽݧr%ws Մw _8g[: |=qqً/Nh5jMWmT˔$FP՟%b P)qCIB>8"odwGi<>t>AŷߡO ^|N a] |L{[.쟏b=j+\U "v{CώF+<6R迹go0Xg&㈅IjAys pyJ]D=˸PL g&`CRy-m29dI۝ 'cՊ,V]ב% S) T߾~-}|]5| jpt6lVMYQUV^Jc% n&GnOd=~ﺳQ?Fq ^>Ⱦhx/[ћ x2Ywy܉,gY1H*ƤP&Zc9dB&I-\{)~jMg\R(- јXsʫWT($&Rn,I98̐Q7=kkљ"N w%W.HflrIKEs^UQ+ rsRI$e4I|+c$c4"ަzY{9oIsmMN_ljLn?g ־ Ė..쌋.fQ?>m|.L۠: nc(xޑ>y]e } wG7ƎƩGET XfLY0KiUf*'gDb&|CRu6 pls^@˦=\ ^]Hz/u(ϵmq{''c\)K.t.`V:%EPLlc/I+MM9%&^(^h&0)w7ۿ&-"ц j8\q~K/ZJzC|VmˣK,(1 uЈX2A r "2n94cmNj9'ʹu%&TXbL +<0@EI(Ӕn@~efS! 3-[eABLhfCϤQE/itAAZiAy0ǞNXHgS[*!gdJɴPdOVann[AJAZ%FhKQu"ڬJL SYZ]`U 1$Ɂ)3c4&.6\v3<$<.W5_uV9 UA'OIx>BӈV̶J󔵋'W/} k%aQŬr yh2E@ }ON/8ׂzl5V͹T:v"RӮӟg+9^.i\/kgsLɫlAc8ϝn>|8; ) G;=-5\\ɲjj$,f~%Ak(2E_&=9 4Kkp}N.ui_IQv%-M>A*a@$x7:yj楊M2WszDq/o_~˷uvi)b[v~[Ƭр+UWmy[US}TxozT30խ n3UN=[r2Ӛdw *u7VqPIO$-_q%ieDB̻f"*; NbzvR<|68W8s ,Q{2 &ӒS\n{:ply'= pbKRlSR SΒior J%|{ʽF[&bUnwxYaauA\w,p+D;ߘY_LΘdqW(Ȥ`Jv7*ɳvVs,V' . 
h9"Lk9`%`蔊Fv9  c > ̱,|SF&0dJv 8o|8x!m ]|^.]Ph1|Ͼ{o~tz}A؈*J̐yAMv02F)CdC!$ׂKRp%w2b<;d.Qk͹ўm[i1(y(JmLүO77_-|rߘf?% 5Nxz/fɨh) 7Cɉ9 I ImĠfD]xoܗs^rۭ}@M䆻?%k<ʣ`U͍5:wאW<ڭUU#ឬ l,YVYUIGЂE+!'d pQI6ΉVq{u,9Kʘ cBkǵ r .B$vڎZ7`rR|#_釿SJp }a:o4b=rQo&w ZIeɑ"2pH=$ psFJ<{-?Zxh[;uST5}d*g݀wMF+{c;cq$z- KNIlf&AnvO $ӿmQowǍ?m^&1_}!Jr j&zӋI*|4&M4UZ Ȇ=, 4HI}ѻ84_ߏ~j)^Bl>I\Q~ xĉkӷɩ]7:-իł9, $DC!7qIܞ}gn]V<}mL-k.6ԧK"sp$$lO?Pe޵=O8:Z/<үʅ(ad֨{Ⱦqtב Z3+Z#k"":RZj)ݬKﮞsSu=4awkRɆJ6]o=]{+/(15oَ6ww t[KTLO1:@l>/,~qP6:hހ漱Fx8{A!б4 :c!_o'bvXØ*9#D nOpFbo?Oi>ݤ駁e=Y=&҄J8?Zb0- FkDü"٦eYFu@kLqy οf]]ս~ y;N Cw8ýfiaeЮ/] wKEe E#C(59X mPT{t:{v.`CJ6ˉKZi{ͳ'yɴPNd$ Nj\0@5k/6,piNcq|~3>&㰛ǯ=lI6NZ_.]>O!LVѠ'$DRPE2/I*CßnL8#Qg7 “ B]7a` ά|QH* ^f.(%#P$eLօa4pL;&+$&/:1(i[Tu?.xР e':ޕ :x+O`H D&ه<O-Vi^!IwhLDc>IOVj/6.>~ %:oTf5;U v >fz4U^Hu{vTԫ ,v}5XseUcn5JɭVnʬNuȧ(Y߿8!rqG|%4]l\o)uZq\kմC]YbY7s}=T[3cZ8(Ow>w`+uw%Jt|'`}6٫{fIM잼Bw\z'tbi.)%diҁ^2*]wn%Q+bwrlS7 [һQcaXKyZt DD&!dEf# 1(w^Qf!*k#}hxͰ]gi{y\><=JY69\;JsQ( .-S-3d2<A*m2 X@A/ʹSo<*Gߟs(TGUo2 ΃ɡ21;A[ NCK,pY!V/],Gv..Rn1_.7XE,]?߂Kȃ(_IP*1<7R`aV$s\ 4:εs{P\vTֻl"#ZzT92_yW3g~g[*{kȴ;tŠ,_FY&MWW}4~nήz/CdDEӑf d ]D (LA J#(eK}A%Z<HY2MHJN)ʜzHΙr Hhe6XS R!8\d3˜% BW9PPc]ozyȍsL%t49H8&֋<@2 D358[2Ya`NidQe+SB/"Rׁ,2$fcZ͜/u Zebh?k#_CIhvR\>aJ_\sٹ#=;F sy5?_3\WV,|#;nW~Nͮ[S/泷󣵸_>6b6PYzU9bOS$.U*mPݸtkW/o;z9@!P[%e}m.ޠqt:Y\J;QXվEeN|9ݕ%]Z(+nF2:`ȪC2.!3#y@y?{ߣ[8׫#p)o̍bD#d^K&9L6*&+h]v8]H/{?S8f4îtכD5o?>L1$zE(ѴгƋlI2mJYm ֩C>iQc](SZ*۔U֧*U+ںUiեr?0*FZ)woAk V1s>h-6XG.3z%ǻ!_Rw+Tm %egMԄ1bNµ4=v_̈́,&3+別춸ѹk^R"Z䔄;&lQEV,C똕2Wva(( oR!TR2玡Bwi-&xB. 
YeuK>fg+Z'A裃"pK E4 gic8Md|HEb5Zڋ:fxyF^d5?}P'w&HUEzr7NTRZ8'RIIbE1}2'%d {Ǜp[c[%c3 !k#JK|c+\.jH)F"P+QЗ08dms)T.WPfX'5/]FT,CD ɞ5)0sS҈gE¥:htAڳRȮB&Qw_KͲ#oGDf#_f*!c5.D15>#z ^TTlPtA[ xiЮmtV*A5F7_:uUJ\, Ƞ-],b Յ!c1uh0S#)cއ"(4`k۪npWO+5+0" F42i,QɕmB&uT[PN+$_,TB24V^X = e%˲fH57]\?A0nP(SP|kB C9c84iXȠ x9_.[[ KQHN*ƄULlv1D!1M7fͰe~d Zp v`c{K;3S݅H}6=DD-$ >:%Fq:Ҫd :!JrJ2Ba1%OpHv9k=/3(sX ʃVsF<@E&rZ5ȼ`|ڄLkptqX1FZi(^BB$>yt_V 5ȤuՅ:@vdm>XTu~T% 9ՠhKP-W1hFymC5YK1Z1pߩacU0Mv>v/Y`D>"CO&XVXktdDPHc ޣ.O96H!/FsP6DD —9t IgmG]I@(ʠvo05K q[O κ$!H;V@@` )-f!c &$˽e;V5> PPEȚ\\"(ΨAb&#EP] <(B"(o*"㬪%e7CQeXYv wMϲ] [%VmZbAn7+itԞEw64IxTF+Ѐʬ$޲[6Vӥ[oU4ĽQExwQIX6 ئQ}AwbW-)K UK F㠝ns2e1WNiy\GIv҈`֣;V$FOB=V`&?v%[Y:Zm5EZSRP'ՓFCoׄ bLLyvãa#·UfĞT2j7LJڸDd]ж+9m ]PnDVM~nQD{=D+YLwz㐀AtP%(H Ш3ES'2(XoܬGŰhETϮ;lE^QH"NdviP'7 `!g0Eu7/V1Z8}{Aa6:XLg= U4+~(]IڈJ5(Z54)hc37 3E5jփ*M3| u21io5pR 9O:'/{iOVVO[oށ+qp)< h+J`[W0pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \}RJic`p|Dx<`ur+b+b+b+b+b+b+b+b+b+b+b+bW^Zp<̍W^:)>{ V*~ xb+b+b+b+b+b+b+b+b+b+b+bO \EL+`nx< 3peb+b+b+b+b+b+b+b+b+b+b+bQ ;dD/zI1}a-|x. o8*@gm@i{2O_? 
:"e(8|7ϏOަ ><^0=\.m-839)]L$4t^}l[;Oq4Kc+v ikM|V??;=a}꫗Ϗ҂W``@j #V"ъ2GQ.O?O/_%R0cn,d_;/.B(??zgw'߬{9E?9~.>~pOSf7_]A'_gmk5P#EwzpR|v%R(ًinyKUJLi C_{ sjxBnByrqzq}}w5Qۖ5wiz<-|K*?opv_'ޏ$rCOw/>Ρ{[*aC2liי}(]I+-NtVw% ˴WӺՔ5O:ꎉLRVq†7S#guI|3{aCs d޿[mK8_~k >ߠ#g=;<\cYdaa-m9[G[z[[{IkNfHѦF6%C#"#kˢG(k)E,*歝KflX3f94.k@mLA7}o55u}Aظٱ^8G}_:Y-M>xjh;ˢ;YrtGBЃCR)f tSNpV[,&?{ghz\[Ez^Dz3[\zѣz*]x*{*R;;|9Q/،5 ܤ1VT Ib'ݮ^ViTuu4sn'=i*rQepylJ=7Ma_\_.PRZht(Ph6|h'/65caƵ؁oOiT~rGceIUQu[N ъ ޵.&zNRD?_;GۍJFs{dO G8r,  ?i,ZDm"e|rؤAZ:wry ڙaqĶ:Pn fJVUD aAh-j.eUJ G:n T쥪#%ȍ",{Y8G0)HCʖݷj%qHjɴ5lSVOMOU= &W:`}5:mG91ʪ1.A95 ㊉y0wֶ&,bߦwi`"m-,2 1IO"Ѣ0K]FR2 C*0' &&Kt||SIڹ9akԏx< hbFFk^#vq| \Xl7KE#()\[Bdv XQXEAXyWA/6\YKՋ^^&WlJK &$LX|@0c = RDO^/>^<}X;Ees=r78:kY m(n?#o}y[' E* _Is`o׌Rяqش#c4 D#@#"u1׀.e8w;u|st@{:EKPGêFƴJXSX̥!"]PʺNz O`x7ǃѥѰ~^Ȭ mM5@OQn쒖sQfL7nvùM^4;cjfaUv(2^ XNioi1H<%ˬȟsk1n0V^pa1r/:qh7_Y-{S5s[c0-Slg2PFk"A𠢷Xk\*,h{ 0j0D b=kXc876޲.&f]['NstP޼=@8,$o?Ytەُ:>*G R(RVJHu"O;G-|,s:T3 (_GAgv1rvHz060X Nw6:h;ܛyWG:k ƞ_~-1w?.n`).'%sJ6ǀX17E.MIj)\0IZ"Y ކz-&}c$X^NlVʹ a`jĖ3 %$α,bȀQI.g╢h)UPΊrE] Cv{.& 9$IbC^<BLN!&J"{() z ?&t(feKL v;+vʅȄR=*䢷z;i)]lP0X!̼͞0ɜֆ,̙ƭ΁(MŤ΢pUML%O\:LY(T8IKHiCLpX2qa7\f쥭Ghw]Ë>J^Xd_coٶxxΨ1\miG4.uDc@;y21rz('K0vP5?qȾ KNikp)0nԋՃ\v@P }ɗ 6Rp]>ؤUIW 2!)®@t'4ڊhksH I!\$3EˋI!`J !`^kce0Sb9E78 ׋= bé$Cvd -|%KB,#"3RhZhIsº"~l/+DB'5U>Na\%:TGawV}'n8Z|wUs5O%\?8* Z軣{s}?W=BJwc U6+jt17ZTm]l쥻]'=q`6nҷ~ H_oU}+!e|̕m3 hueQկi~Z\שDr0VmQ:ᤩF(q;" #a86ԺaHiFcѰ>3E~LQ`v)rT`ce}}#mBͲbغ ;. 1kN7p* Pn; eB̠l-#^45^IJzvqx@삲{*Xkԭ8koB/Ѐ_fpfTtr {ug4p;O=Nw+G0lW?n:9aǠm[#џ幡OZs|ΎRʙ3{a iV$ LH5L/E\&(c$r  (t0)@i `-jKs$iWS҇tjO_d9QZj@[/ "l&54E͉ztӶN7a7_P^.o hX# 8ԀANyô@tO;-KeVnârojΗ*e<ZIB@n3þb4Ini$Sf S{#'5M .Ы m9؆h iIRRk_o'M胱ڀEUxde K9ۧ^gx^OVX'>1e{lg<Ʉ(T&SND&eTR6j%1 242` əL<bTg Bb@'GHz؂On5Fv^*GGw/2Rإ2oVr1.$c!b.jŎç-6{S?&NJCV);4ͬ'UpNtQsaYΪN9O,zwQ^7dQgb?W+H dM֊1OsH! q O9 R\Z(*M`5ɪL=2bpBFQN8e%#)*Kwr-EgzM.HhyX?{𴿫>'n?7zwmo&ʟ2NgaOj1ל. 
T"+\$jNf"kUa!T.KKyLdt>H5DYμ @T)-He.:%bX } {KHeՙYq# k{)K#YV9f̈q`ThN i!4NQҋ*SzJ?m`tN7e؞GWt0êҌb_< ۦ) NN~;:5Vc祚S`אlu0r;t>EUoQN;&[T}mFߚf/zBSJשMg' ~1bȪxQk^7@H7B/s3J^1*jd//S i|4rɝNS܃K%޷5KouCF.^+rmÆ5 Hq.6i&N誷TY;N"K(+Ikoc+fqm8!}@k>?g7O9lL0l'5̷ܺOZ"Mvyef~F[\}]xn&j}[&7j4zC̒5kMGP$w<ӇEr{f~? /n sp9lz8i:.뮝 wmЁ27>N=mFO =hpRM6L'1l0]׿yӏO?_]Yo#G+~ٙŔ!qavCSM$ՒwUS,"%狚*_dFϿsLߝW$8NZ nO{5@=pa5joZ5UlEn|s V]v|K ڞQ7N鰪[t25kO됮ARz? 4&wMtWҘ{y`݀@g0= Иe4(;A$4<`y+v$9yLZ0vT;!IO}0Uye%=wLӴGNq(%[a!FNH#F$2k- Q:tgkPAr:2;=Kܻ p%ZVv%؁/bR%c6(SZ8SFϐKў #`ݲlJa^% |rVk/;^s%c'u֜-&z?7yHWS7'NFxⳞݸM\9o?.1gBGRh)|\ `aR}J"L;I^ R[2`$˔RDV탌}Ki|-morY YgůkWivd3k 5@{Kr;`f>髓K=\|JPieTQiǼx( \iA1Hsqp5Z{R|'_vkPl۟5 @o͝ΥQ~m pkyۀ6%uH뙐!-(BEoP*nq]^+q10 .fdrH8+ʨU n `%qyå U ,&՞#,RFXpt5L c] `VWål@]w*1PK3rf;|z0bԻ<blf AqBoqDk?;l\8L0+'~|D/AA:ءkucB;(Wk_X j-N[xCUVʼYj.hӢjf4O3 Ż:5MfYtKV<iwNAmӦ&^ h6>V^,ʳ,?^8gY H(򥖜N)bxЄ~*jRDuw5{-Si^,m&hu Ny](l,Y D$a> #}\ocmrFy`|7:]\qL1մ{7嬬3&>R2ݍ~"<+IdR!KN#-Y2X&9xսaNұ)Kn _Z/){E7^~ kf\[}3ݷ0)|*۵; I֦o05ҦT*˽e;/jSYѡC4>~Icн]Н!J#p4`^"AKk HX#E8y!mLXB־ -O*2!)!A!GKyt"ZnqV Y1o`=& HoNۉҔ~wq[@q-Iw[ ^+ DG4 F DŽ`Fa} o)]U|p+umk{@iN -%jĶk5JdkFgG& PdXV󸭠 lc[G\5+r[M]}[ Jnײ5#H$UUmW(NFyP 3)pBs'D#]b~ L?E@ӧgůi\.8MsjERnRY+}%y.qe!M.+@8 `63mO)V-"IwQÖ כSj^GnfS.C{[UN$i)`>+rbQÚ'E:{Tq?Dߋ8~..z{↟0t]bUN.NȟTuUM5Rɧ_i*9h|Zi nUH7UVf#+UmŻ# bx>ʏ u ܼŲ;+>N1aKc6Ͼw}wʛAo16?00c/܄qQ=Т<\xtxm0P;t\z_C7 ,S*jڼgy56턷)1qLԸ*"T4=m*x}75||_(It! 
F$̃ڐ@#I'u\`|AbO22ߦvs:>.Q_hGqs sa ,3B1SHFF=C6B;;2,Bj'eJ/@/ʣlm1?~{:ąG^g y.e<]g.A5Μy&b>:HrhTU@/ 4eiJcfu9FN=iGtZ"q(XmW^ME@v+Ŵ`V`CU SQ^w9|u%zLױ٘׍2󥹙\6AuE:^?݁[.Qi,#LG5c K8Ya"DGOcpKG"h@Y(F刊Z$ NPB2Kpg,kqk8Nb ً;Ŧ lY>e0WOgY iO'n> yJ \rPCYy8KQrF׎;-btؖZv[S-;ف鐒C S#3MQEzCp"9a҈%z4 r]v2=w 4N:NI^:+H+`RQ4 g8`, X^9CОkB `؉cgkbGw㬼]/Gkw66ԝg2&{0}Eh[c_vp3@6mF)z=K>ۓv#N2 C؁-]@Hm kf jՂlj+͐y Z9vxQGV0Q0j`Ê)M9o.0L6pppJO=:KFs6WXBpЙ)ŷoM;"K7|f46GÛ忆믍maCQN{ Ymko,fb)6UGTg싘}/b&Gt:NgL:I3t&ΤG|F$R4Ng;5NgL:I3tYܲt:Ng2t&̯t:NgL:Iɤәt:NgL:I3t&Τәt:NL03ft0:^$t|AmM+k4sf5F3h\k4sf5F3h\k4s(}5ӂd7{^k4G f5F3h\k4[Z(\k4sf5F3hF3hFׁ.k4_e5F3h\R8F=k)6IQI)%3ZX&Z#Ɍ^Zfz~eQFˠkt ]mFZτ#:a`_ޠT2㺼Wc`\R6 K˭.#:F *W/KZ&ĵ rEPHb`1@1gz4‚Sg ֺ^jiߓS&C Ikifz^.=?x "bPX OJkhaUz K h{'T6*Y$4Q!Y5 9=hCΎ'ptmhRFsd2*R$+( a#G O rARvc*24D]mo[;r+E܋csAe. lnp1$[$ޢ#vl%+ D֡g>zD9jP1'1+@D Ei'5g,ӭ.v}!KB -LFa+ZU<ˆ@7e0%dSv55T)`JHdW@} eD:8 { 6D j3qv* d<48UWLOKg] ! ۷ؕur1h)wz3 =joa[+Qn = gHX%%ֹRjԀ(|rgTaֻsQo}.7!ʌ})%XFca'36gglUfX_B?''^^\Vd]myq8\=ov~5O+J9dz釳;F&8LE*AYz#*h!k9<14 ɳA[yU4/C 0KbJVK^Ggq+ FNR3 $*"9c7gǎIb mGMBٱ)2q*G.0eV ݢd"p\W4-.:L!32FQ95DO#(~c.&A59lMLp1GlFFq[5>/H:s#XEX"{o+*xdVi]8=ʦ aؘVuӲT?sL^p|JIK:`cҍ=b3qv{ooud8_9fZX(EwxBlIRrd Ph(-i1 PSCʹP#ɍŶ/7;O2GFܝ7f?~{f?f?x!wrEdO34T t:zeVP>}hsZQ:fVKkaq\ :U@X364%WgZ+j["\tzJ,X<+zRB$ B)$')7~L=rهnA~iX zzSvޙ֯VQ6Z]{E:9g%ˇ'D

I/&kjc6 kY&%TEL*ΈѢ4f";!7[m:*:8񒷁 BקŇ;t~|b?ulѽ@(uon~FtA}=_[3ŝ^׽xJE=HKgJy1=;MW6nN]jz{rī Wcl؆I7َo^T:DvJ+uCHY;(R7C{n};9}ONwplNwwduwheq^y<0b$Q\ϏNg 'I2X.Ϻ;L_.u,ϵ}J;!X)aBa!Lx b᚝=501^%٢<߼ʫsSTB8G's:gWGǁGtյYhd`W/vw~ jPׂ+VPN09JʈyXoNY=X }0;b,9DbK˯jj!{r/+Y9c җO®%(Յ91tP|'Dr?_pXObКͽ^`c[cZƎxIg !:tb= {F Fz8LHM6&e4]YiwM+AIGGId33s`-%[B4gO!=R T6`8u%ZW/{))xN T.4~KC~B7.z#l"s( .,8b 294Òv5(F?bx9Xʅ6Q195BCn)FDQplf2}4{b[ߵֻzmB[9ؖ#㣱gz4[jt7r>#O'wwEde^S=^LW?ϓ~ytcJ#YDfӏWO8uyv:R I>i)3aQ- %W>3|a|FY-vwg?kE/~+|p-'n =Ǩt=)$AZ2NQRU0 K*CB[ÞWנڒncL!L‹!Iy%qi>N TvJ{,ccA*Vj<.#eHYzx*{6\6e(y|qV%QκrQ(*Ikc ۠dYơ8H)$8R\5RXxm0RPo%m闿>wFo]~ȝ]j"eYOlÄW84}V53v}adW gSGz0K=LM> /XbL>UQؠdH Z]ke Gg,YZ HMaɚ"C`=cNc d9W@ X"D[9f4(^\{@ikN[D2)c;zKjzOcO|#XGnr8OڞM \c#@e_Gn" % .RΩԣd!X^RB|TDګ\3)E =)Cr9"sIbmW{MǢC o-̈|VMT.4P嵄X2eP(VZa*0u7"PI:K9JK+΃F]|&j!Ye8meRį 㲸wF'4.#kszbNPG0NeEE!i[Ah+ 'mKYj^NJFCQKa`R)h8!8M0b)*@;hHiHy]f l`:ɉ`),\IȉုГ/g݈#лd#Hi/I $CBaTDG_C]cdQD͌:WU&;ufh<7+g&+VB0fEkI\*wQM+_:+,rz8 Xu)]Oݖی&%Ri8l_JVV8GwՕ1M&ٸ~yxy~v828fbݧ0 's+^r4?5g^lr oQK-lI571Qe|=}n'z|xpv,l\±YOi(> x.F =V^ᖗ*'/ '089$v?ۗ~?gWo_Ugf,ZvvAo܅_71tF|M[U^۴0rUNXu9TKbclmmHw^$N[(-$:m5B&1?vmb0}ō|M,F>JCb= Y4,6U4$5@NJG/aJ5ȘQjOfQ`Aؤ3:1So'}iCpuK22dF&prFfE%'xɾV<'HuvwFA]U}Ä ];-Q~oujăA,ΘZJ'j$JI3ts9(VOȺ?(|ZLYd9J崧vèMc>KT4Zu>10GS|s, )xkE 2dE4'u֝P $6X&B näpVs"Olw9űGGt9yo Q^?R~ Dwb5}&G '$kiK}> !Q&32C,7\iYj(uYPplZpWPIu֝6o~OI.6F l)kꮞ4co6 m«,{A6 &#V9oJ*^!3BhdHJT/=f:IQl;vze +(37FI0z'g{Ƶ;yI<aVMdƈZ:wאWӦmb |! [5 GֽUc,pdA٤ ҂%!'0KWsS`$]ʈlLe85Ƅ̽͵ r k-BH!Cu8d4(4߇,i;_{%%2R &42nӌ +&/i2 dO i!a<:J\Oڠ2>FC/48pCA(B7Kd"*I*\Yu)R~ƵL=F`2tVÍhf^65MδQ5U9Y=j~prBME34+j*Rjh/K H{фU;jD7VMwmz˛jb̟O^O<‡/rqVI5ۮꠜӥz(/aNҠcub22=z[gɽNf#*=rFdKv. 
UX[prp"#  *:%nDNTO+茲+mZAuV\o1pS3e,3p],γ҄:M^Xx4m%9=xhr*#詼@c`c4ĦgGiļLvƬıh ]Mns8R=ڸ_i`Bح+vI ZWa#;+%iVaSBl2z &e7 /abZBʷDğF?&3w2hz8~ihޢM9wW\viR4Hǵo'AP|XH_<]O;S:6 z*ϫ?R#w&MY| Kq'0/NwLQ)) YCZb) f|)*#Br;ɓ3"1A0T]ӼjB\|Vk\Ӫh\ՔnOp5mcQnOFvTR$2W #,:A }(EPL삏kG[Aٱۢ>vW1aķtnBj;*b<6j-Az#X ..$oߒ((}ToNV)+:2$TA>Τ3z\nj2ZKRwxʔ,x'SY5O2=J/Ң>NUwUEnSZiIvZ_yZLcj_}쏦7K<eQ\)%9 6FΘB:j.d>k-xH[$Kng[_7@-ڱوV͵4Ύ`jTs^RH:dz!F]bݦdu\pe#'f\i&imAxF#d6J$Yѱu֝%l/f9oaT-K*$dVLj3T.'d>†"BLDM:Z!κ.ES^BO?0蘳"fe#J=x8;BRjuQo'vҵ(Z0B@ʐsfڀcpaupo*:KzщتG){\F,HQR-&3GPAut–iF`Suy%u*z-+|cmridxFf)' :DwhJr\(2Fc0:5KSWtC|NqYX;#uH=})lD @i` 6QF&MUQ_?}ނ8KT+,hkPBBRĖs-!)kIul` fYfP[ʅpCHLH`46: dࡶ!2%CڀNF*ѠKqjgݹSA Cv% ZSԺLȃ9,3I7a`)kh#.sE|WMJW y7>霖I*BUja9tGU?'__褽7dtzVԳ~[8G)T|rgwե7|s];Nv)=٩v޵ҧpt>.OFG҂7=x[OkXM=v'qы/>ӄ5-\/޶wZ)pu|Bzq'/x3S3XƻB)͹=bIWLEBуr4U4d9y:On+xt"`\2QdWn󖫅'GJF[7 .mƻտGa}pJcGU# &<̏=~wOUϦa''zLwU n{d/Gq5nTz߸!V`'0.fRTwxفǻ{olH*9ܒ*t!#;3e7;N$eElGKO 173RR&rӈ#N7ާτ%Kl"__]eRR2 aWPg۞]BwF]I8D_SH$ z柿Рo~ȬG5>0Cc:y ץ8t |`8uܹly.c$[SkUu|6X^)ub"Z!@"͹K)& [ }tb潌maJPtmwϲƟ#Z>-#m,l-RX3\-k$ ZL;nma%6LFMI,/q:oVJk,ad(Kd 19zK&gHL!T 9u* ajL"# 7$Y31hL@šX@:Ϊ]XdfX^r2- tH0bJ ҏV\gRLK4܇>'cYN]>)Kryz 28$Y"-o1̳d5uJ6šLq_w%p3ٶ3eOAwa3m1@9Pu%J789+=ǵ#]y4rQ#jy^C^yN%?Zu%}k&ZX1AD,H LĦ"-I#OSҳr"8u)BrC[%SYf%ih0~kZwm$_!%/c=&B0 K)II#%ӬUuo^>ɴ~$)FX,{+C9p<|'n#ýJ 8$9RT7Ÿ@ Ajִ}Jl8M>w\G3J訹0 *qwSEýwx6是Y9k>QDlN*c`\kVy&DƵ'R@!/QR<`5`UqspBFQN8e%#O-C9G%.JM..Hhe<V(qυlB>q[e9Z&*m b:Lx^R0Q{˳UE=J|>3j}jկ Fg-|"E4|ba@BINhY5; (wWGK(? 
,I&^|0"'Ak≲y1Fɭ[G"9u:fidboh>罥Behuh{!)i+.U_̂tX8f8e$ VbSęאN>㄁U'BrVot{ fD\U,!dpEӑu,q%zTdX.HDWng-6lwuº$ o%gF^]M0ЕqՔzOW;gsБM,@B|Ҵ :`~kYйW (t (:4\h"&Q,P σ<7U L+`ZKІit҆ 'Bkt΅nP\|w?V8Tש]z7#ZR"WE)A:i B-Y]@'D: NyfvW W5:TIP$Zbn#s#=4:$;Yh'J X!-Y8 =!+|5 +̈́^ɫ Lr5!fQ6eKz5 !rRoRm.|̍.Χ@ `TQG  ^o8ɥn^6kq45hgl ȥgw#Kʆato657IkyD)#Yj`3?/֞Xd1X+Rكi[v{9-:A*J 6R։PXZH.vU1*0VAe,$ JiSrB’V[52T yD jZ$ qj|NJ£\0(9D1%b[6΁nBq9~B9H?'hZ?~;~$YO$D&,D&@'QmSƐIʃ:9E40j;+ꪣ,jTlUC{e_6/s p[!}]g*Ut IF$RXb]arMz/Q1tzWSss!CaWO9h_/GA)(MK\ǴQ";<ֆI^zyy(7q>q|U|nMxJ`"MJG.ˬqK?>d XoJO"Zq':AR(1jiN˨)s< \̰duDhXF>c$~\*&)"08,;7kpȲ[Q.Swl,x@&,gs(AQn0"j00\9b|O;g< b'zEO'y4#CT,aphꐱ1$AMD[%GR➈;օ'.6c;述x^]9 |Im)yWE}wGM9;lb_6&wvt4WK^;XVDVjPWp'GIpMLDjcDM}ƀ3stgDoLj>=.o!I !IT# ;׊+0⨊c6Tk=űE\xˌ^ړ(6Tz `ytp1F4i!n`M!P.%eᄦR-.n:^6:(. i$Dv 795FX%;oAE vbi,B$WL6CB,lP^bkNm|>@=q6K5  噔v"zkbdpwj7KQ\8VgMM&09ҌɠMzn;x+/{WƑ /f~`n }C.WgYTHʶn!)J#i b8ÙꪧJ{M&g_ \=9EP>W2OYZXQ2ˇ&Gx6z1bK:O`Qy "dꘘƟ0ن}k. ZPzTRq (U7e5bs4sgUc]MY@r L}Q_TgN,IJ61z@όtDLK*BT78ӸEywn ,KjT Lhg\i|òHIJK%?%Rd݃h9+]\6F#,Ud8HHSJl`S b6>罥B$2Lye,^iT7Q>GB^AvI@IQADg8FjR!jkVU2b mH2ϒO19hHI4(#HkbBeG&vGMm-7-}ܴlu=ӵdHvL_ͧkP(ʹ>}WM۪JN/S{v7li=9><تIMtT gTV>Ŭ8!)PιԲN9\ht̸[?Bc#I'=gѥhrN)R)\HFjlGz\V]VBsƒWu\fSr;.[v,7+;qw Gl7 &RIk8II iIJYCJRɹd&3 WElnxPCI\d{8+ɕeϵy`WV2 TFjlGl?k)0Zq,mO}bY|fk=X5'5U-Ө#291IOeRiTPxHP,1#0 _8$ NE^Mx&ˊǎ(I]D䄈'D|<3$Le`yRH1p\[Bddfr$4Q  jLȤi }8o S큋e-czYJvEV OxۯRL0LOyHV}1E2:'\<. 
Vcw¶4蚎1r5\8]FIھ~ԒH]}c=k';)~ojl(׍r4B;߀]ʆqwx6枪힪k] N̂F0QT[DYԎg.miOA,vA)[9$úI\07$BHZcL{J` ϵLTg;wB7 [#fMFPzN:?Ywl>$'YbҁS"i+y3_- xh殢_fpWe]w)5]wUX+"*l9-ǴXRZQ^ͫZz``\ ѯmgN(F6-nBhzg^KSlvJ@lkNZTn)̨{ql܎yv<:ToW/p7/unPQ #>ݓg[0[tf^v/*A*ɽ^vŎHSϲj/|>wцԨkf=^3;V)bDې0 2LvIK$tjOYQw 5w(T=@l㥷Eb^'G@Hefe&o`s\K SBY߾G7Ljsj֥e%x%5"FL0ѤIx ?|5&Dzy12˻\4OKr^ҽk~y;A47jN.52xײ’R4A9[Ѿq1F{L ǧQq:~6W0n㍻~6 /U;,X3 0 'KvW+M| 蛟.j~1[$omm@nKad4ɰWsr}Zz(ŃhivGb;;)-7Hտ[G;H}1Ma>ٳ~-XҞ2!ax!j#t\ 5>)wo O 8D rVESyݱ]ܴ=G^강"Ƨ`."vR-WѻFi1A#غLkȪH<J]dwikh@>лNOp\VQ,|ݴx)SW+kg%Ofֆ_;%Y} O5|}0\3b4rڷ_ WدPvN_l;R:7J˵-n5pr7ul>Պ[ĿkJ0AJ6wx~sw6΀Qˍ1(ܸo=9ݻvu?w mV=F3roa˙T,f {t(|\1j4nVc3MJ7'V͗)L LFP:F嶋߽p1a s&MNG'id*o'깢Aɬ*Pb_n   { mMoϑjgݧşK'f|) XU"5xd)ax2e'AF*6];D- j'zMBE$yYe#Y\;j]gc4)@- Fa,Z@J:2T{*Ecv aҊRjJ[FiZí⩆\s\RRRW:KB)",D{0)YDKdk0-sw1 m"D%̘̘*1 B7sm)D sDUnX41 +ӵޕvHg4j4B}cW K"$\0kbAcx!oA#9!aU9ج cV2L?򨔋ySV[+=DآuX#hqi~}B?dbq 9F`m=%wR *qՆOB/a8n@WIY(]p)k))p$hḽ!R5y۴ߗZS2CΪd)8$+3>t<;LHIkZJ),Q>R4P3bC8IĤՔ)5H@$^aOTQ^5.7D С`$vqhvh%\8k({%.P@URB3CaE uL.ӌm0IPFF8 F})@ʚ [# &p ֊kO 8W_ep;S?0 KjLT8S)VzIU, @cP4daJi6|+ Gl)kU4Q[xDj6[&aq]5znAV#C` J%I f ҄{@)35XȠ>hLS`#(_6ȱd31`+,𺌞'C) ><T yIG) i o"15QpY`P|@A. 
B LIl pȤ``gRv@IR`u#XFd6RD8B{hTKB(pseĸ^azDWU:nsX5]u"f+kGn:@C6+*Z@L.H􀁩p1$;`Kj,(tHVzPr ҈H1d"e+F胕EJôL燍U}4nh=DhEEYR:[Z#&@܊`mmmƂQ,TG''biVp6SRe%)8 dx?n|v~v7d*,hb⮆XM'G ]ru^s@ MހwK |Y]E!jUW@ @V RX|,#m60Ze 5sr~q^S6f\ R\9 {ߛwϪ+vGi.a6>7{5x?[V#.Q,~,O..ȸ^ߨgJBVcWE,wHQK dKߖ=M=q+ ,&}Rn|S'#˾cD@x/@j(Vר8pѳS+ @/n mS{Wg7Zڲ&Y'8=P,+\N%;pT!-Ư1Yހ@N1ѤuA|1,Tw/7ܒ ٧gۢN#d{ut:ìwzYp+Ž Z zOԫK˞zu|v쎹PL4~򇍗cx~5Phx11"}:YsA|^)<^_;Nw~ jAӽ*: _|K;FV1m}=WzgӝF0H.diǬFm|X$/^fLu5ˌZ.ܰ҄s+B2KRJKʔ Zjncv^˫[tW;q4<;<,k6cHZ<nVbmv ~\_4Wڙp[i |c n|\qE瓷9ʵ[0^sݦR~tmGNZ-__]ji0-UTO=Rna-f LՉby0:<=G^zv;}.vi%?7o,_g`=:?ok(?C{9}~57} =]_v״!gq坟nν['rްh*2W#ъT}bc5;V-3 7)l+{׊{>vŬf3,ncآ$/K$\hbp„ "B*Ic,HmX)xf˳Lg0P"NX׎%(6D+4:|NUq+ J/mrބX22,E>xΛ߿Dq#яz xvskvڷ"-J(bZIpM *Шmo A` -c cFfvxFm_ntYz_.0\xŲ?L:~~*'goxNٖp%7W̛Z'/MEmĝi>_.TzsD?>i'mYKEJ ێ;پECK[ôP.B:|\ŞU,=#@"神t9 +/g(k}/C|~#bEqu`fWwl:}8y7<:YHnN/޿;M|.7y+#?mgNpe8> Ӷ8Y[=Xi΅ܭf(U:ٝգ+ǸW]^λњF ԯm[(Hu+i܊V,{hyMG\Q7&Z\zCcЏOary~M@'zZtGy@"".>2YNWؤ3trxaZ4MXͬŢ \hxlxϼ\Mb>sՃ/lRN8vy2@wU#ᙼ]b&nhymˋi}{θS,$1ƬV}|JZ+'WI lEtXΘN:A7KLx];He\E_*i/@3db6JAi}0Y9半S\& iAD}['˗I<8j[}H Wg+.~c _/܄υGt6<Y^ewv#8EO( b_YY'Řݝf#@+ی[.­xTNȝ ʗ#]Lh .X$t=іbO '|i{?JRp?QƦN8"31iׂpbț$2x{_V+R0z9C 2;i K*)h .?{F2X.6#uwځŞ$pk,Rm H-)YVTIIE[%[̰Y/UrB"3&[|6^}OhM pNSle{>؞?6w6Qt=RЅyо|Roj-e;ns>;RhOfGwI{oUR6"Ϸ!c\gH DbBW)z 9TfotT7 x.۠24}Kת$=R7U4N94A}nɃ7=tv>0U6гlӗO܅/T>4^uj u̖jIҍUW:"#!GqC@1GmS>zQ*廞.V^#eu/eݭv#t݀d2buIqHKIvԵMiu}XO|Z8J2)I(,sIDu&R2j/5=:(yd6=\JfsjSD1wvGJMUV3CR7/-Cޥu nG --:/ܠn[9n"ӳtԆqO[N~Cpϡko,%Ztxol~)|eޭҿ~)~l§d&CYd\W(.hE}#h͚mYs'Yo֌ DN{ CpX@X>5(_X"RH>D"R TJn(QzSKI"rHtJ Xh]TflHNOd͟o|B.I=iar6BR(T>r,M9m);k*KFiOd 6Ttd#q {-mdj3q7Tr1C y͂uSž,m'=-4=_ jnw%$yqY'O M76yoZԡ!b}zmꕷحT͕%XEFł$ZJg}Z hŸfZJ%FQ' ,b.롄 *XRZR ;֞8{v6ӌ]}o q2sY% 'Qu..i|r8}@$79X3B#*Exj̥¢dB6hhh 5 Y QnK5;Ύ`W8LNJ3]y[6Ĺc4_ 1`[iǾxm?z`G&6tevD@Gʢ%ePe`t:㺢Nb )#d&CE ȱ$U0YGQ| \Ljruf~:}cGhD[hGx':H'VqݤYdD[3pтօ]j+f4d4H_VcBM%aJI+ 8ɲǤ{f?~qOcʹdWE9/] s'mFD֌&>/!(KZrE(H=ŇiǾsC= ݲjfCF'τBַ-C 
H]E1&RK|i&7R6I$􈘊ѩZQ:uEf<3LQQҋ7dhb1WNV嶥OcDGBLn#&{eArwcgQSޛA13}L}cgIo"3a0E/!h\.y**Sw}5&^.YtNkI)JH#W+i/%kѰ|{3qXK>t29*G>-_{k`pS/oZHUgyjx6R(D纰7dAgtHH c,B׵!`f3k=`ĂG;;:&h+%0f+R53EkASaFRDVbS; :y!YI+]_Br )bP ,HH vL-lyf6(]ʧ-mcXwXm#i Z *\$+n4&QoYkbȦ 1Fb Tu}RI%8ؐ֎ d8x;S2}"S)EE_PFP袣)Pw%)TA2oZe\эپ]DZWeǚ/;vV{)YLW<bIA9 u SMh3Z[lN}zo7S]O56-C%8~AqJ%ih0$ \] Rs+[V 35G;󸄱JLCvhl--jǚdzo.˾A]$꺓q#-6kdXU|6м_5xx%)s L:^ԅ%)KRľVrV$7$Epk8F!2c蜴LׁLHƤAX(Sp|)u9h)pTCF(,7ogQMdcNm&ΖUzNdR dM 阍,:҅R%CEiK$gzPKح:JB1,v=}8nj{qf?˟Ţz߿'i<u˳n1!:d}=k-ō^{ۋ*ɋ݋cR&ghd69J:e_4 .='rSëW\caC7'}I_vx?$$zu/tWjS2D`6&5'J5T`:T>׽9zON#~2-Ģ~"[󃡕:[˳eK*p9\t53"Ng͜c9tw!yvtbu/_ewpmV0K]wƿ ,ZN4ki|hM$qD4tUDg @łPvDo5rg_\0^6G6rᠷ AHcr%5B#RV(d,[)j=9;ۚzֶ ﵾkmy;&FƩǙfjln:hJ$lԦ`CE'z8n&N=30CJ> to+ykGLYdAu28nhқ?і0ucbQˌ3Il$}TJ[=G NHQ2Rb"LޅHC Gy%; amD6&`ΘIl[l&ȾM|Ƣt% WU #SMk=t\a)F"jS0l> (&G ӞCj@_OF o{ʞUPi`NEeY Ngû;d@Ϊ1הt ߚșTc>ǽ9_{r)eܢ;?:}۝Mue/n7y ~هכuN/u3C%eD-t!u9&6ES1S(ZYpUA+ $1 v~#~nq ,DL,ː$fiOs\M6`QҺg+q_~NO@vHv3Cy~F)JVVHAW֏2bF^}$^Ղ<Ԑ)4T,)ZTG2ԃk)SGLL%=]ei `2g8B䗽e~? {Y'w.0Q-H-I93|IH8Y`Y4UwWUrd-x)bj=7FD:y% DFᘋԒQqs 1zqQX.Ke:\~`:9dgW&9o'gT%b)D,Jƀ-xfJѓw?)?/lKWA&s|;zsBiy^atjNh<泴2P|5۟_^MȺ-Hn FȋP\DW<_ftNoGiY_m|;Xxj>Lڔ;-P@ˮ} aQ?2FKWQ^+붰[H2[1^QNY5f˩yp -thqJSmy1q7ɄQW&.> Ve=v:t^} |/ꕻNdϑZe-U}yۋEno} n1R=n_[7]_xݕnh٠e_{x/Z{y0Noy;C1P]Eިwxyr%>F}5kj5ݯEjr'0ǝGi\Zeyڞ_sXsO\)g][΂MfBXXsraήԽh5'`s\oRz&C GR|~E1hX6=F>tӓ=rF"r-Rfe&ũ)2e.]E0U$Y Q!.Oȷ1׳o"MNjYNXlJ'lMEU2ba@c>TƝw ۽)?ŨsRYePk+Y!a$A.`iaI{2%Qev ɘ X I \M#e Lq0Н(H)H9u5e2(Ô%$˼3A pUI K~xYhG+m+Ϙs2oGVwNh1afT#[YĐF"= on5lO7Rww'z۝54!|gL ˴\g*խ 1Q*\)xrˏ'wlU$mE04jfK؊Ӷǻ*s};2eQ׼nv7bcy>'l|rͭ ;~\1INd=g4y$xoOJ鼯ߍfl3bɴQ|y=}\M||mp]^ z+$̮{ |ZHX}k:K՗s[Ns =V_j-/5v/Oǣ48;/w}]w/}ݛw\wݛ3{0 ?Dsp~]{1Toյtj͐^NBrO7GIlqZV|;o&RJҞzÖnZS=u+ b~V7 ŴTy.ՍB1~@`0|/wHWgI^_O"+,yXB-s̊.Z($=٢m'=J5|zJjDs*xVۘtI(*E  6buL ˂>Ȭel-\gs9-./J޸x卻Gv>2mqV<#>Hn(+v>5Z,wy *5J1DrFi4YC}u <x,z7>#IIXў6ȃF9nΫ`Lv x A:yb*fW1x-Ea.˜T*d#@)!80NP >7ۋ׫kUm{;o0}y{=v/vٮz=:BqѕEy?VqԋA6Q΄o4q\btE& O2#EVY]ᒩŠ("K, UETXYZTxRhF V%( F:q6}JׂE=bM vO10iaZ9_w{CcxI65^DϢ`E< #X4vFʒJdLR&H9w]߉CM4VisQ&(V3Ģ:M;ī쁼ث^Lꔌ'#Ly&Q6r<^ 
'4%\,9(0Zs!IUuMQ(`Gs- k,eh`49wZ !$zt98b4(!1ߧte9fS˻ϐЇ }3Δr }..Nxh&9N[n QяMB+ˠzP &|6ںx -߲(f`5wK|倭Oh5L8<OR%)훓ޜ40~F|Ƹ9.81qV6_k RVZ5AanяG-H{^Yuw^,Ԯhjz:Y\&UtG3^wykkׇ7^CյW-qZ?/( k"?>=PrJكVֲ{\->5ZNq9{*BWQ&"IhU_VY"l5O1bUI4JxUA*,v2Lb[?Sr]gPr[QUW]%H&ۛ>G?@]>D_}~9j_~~ۃu|YjWzj3sZ,J0bL@)T&WYavݾW_/7 A[4Mkxvӫ_6W߶Sdxn*N][z !bP4Zxrim<zḱCsb0.V o}[]M2i ^V&X;71,5tAlkqV:͝y$=;)tuݱ3Z^Hŝze;xdWw=!ɒk۝Zqьjv-z׽l(C.H6T>1W=G&@!܏k~Z>snvGߧc1{B'o:Md})KAJ]!nA,"7+#Y$b\&z+ly48DAx! 0z}ר]U{6|kIV`D <1y1+r%ZЭ,>h)uҨhi)w۱h}-뤸#8z>Z)!IBfm 1fEa浘[6!ݷjZH Rs{Njw ̐Qnj'Z's9M/'S/uQHcf3(@22Iik*c{$XKC=x.f,cpp30fx̍IR㷝*blMhfV^ϟ rϐy`h`εTBDw01H>Ckеܘ gKe ltbX~17B(b{ w:ўyuO}fiqq]rIK[n}Z-m\'C P)6)tU0>^*q(eD)PZs^K3V+@c'g$5}-wC. y;*8: 00F?`ˀ)ˠ]zZX$vT;b!-dTTąUJr_ :[%J|SdzYK ^`AU/L[ayd(EH;@P*X BDJ}ĚZb$`@ө,$I4V S )4$ TC{1-C;DNby{a$<`"rb ʘ!̽V;@8|HbѺX,T0M<\B1(,Hٔ|5p@gZ9 Y?2@Z$}@8P *75c@0qPse{ R/3Y5*f=.`~[o*!Bֻ2ZzU\ql-C,VS&a$$/BЫ!1*`0  }^]%/] !RֺF]HP> ,C,Kmhv)c`,c n<jx(gnVe+f @beJ#k8-u4jD>%@"yQ8*M73KYXdkјDPA~XU;H-*paQyW\k )XSND_I5К-*jBבpiY xZA(UUj󣷊%Ľң(,W{-w aŤA@MZ yK%TX~0T ,1[YZ# R {@XoY/:Gc5Btῆ+IpU%jP''BNa~;E^g#Y0jaBPˢCHʤ4\jk01g b>E2y U-yAԐJkUTWr6B Fx NbF8JV(-th0EP!V i LtX""5~f ?֞&+!Fm(e>lYV`~:pQi0YXf>PFXU֪Y.u> _vȎ%h2f{! C9tĒN *,Xc#|n\רtECPީPz c #+7B=^r8캠f P.@yu/=^>۹ב}!uQf ?1#7ysGnڞPzQeKz VQ.1 @w|L sGR)2!&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b2<^.1w sdg@м=0 wz L;,>v2=]+kΜc9fb ح&q&u纑d>[95tTv'Vnae׼Nub{^':׉Nub{^':׉Nub{^':׉Nub{^':׉Nub{^':׉Nub{^u#bg {:zVcgg) '<ǒ+FNL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&e)%&N0 aAZ <1"r#L b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&b~죭V<ȭY립>6?1j<`rW\% D? 
<.Ľ\^e$k qT]]73^)Y4Yg AϗvReA.d^|7H]2pu`eo`װ)ݛYq8\Ă[ŴEpTȀE\fn~6ގb*t~b2[rM/b0tnFf~s5_o.onj6˳@6kLyK;uɳ~^&fyYMW6 N’ew6raN ̙09t]Qdu]*09'se>"q,\';U]|8ZUM;Y 7eх}e5e &Q(ɪ*s-܏7]kBf ߚ=>%˕9rbvKϞj=.&VV6RY ;V^Bi^ ͮԳmTs+0m`ApZf'QO?y0R~ˑ瞘<BA{YU׃:Q0f~6xᕽ–Idf1to5#zMi]PW`TiD\7QuEl(rM  ۣgn<2qvm"ƶHsOnJfN~EN6KѫiejUWlS u>+EY?׭8sOnPS{Sᾅl`/ H.7,.^/)ܾ}ZcvwӒr)u~R֖sJ23rM/Ljڕ>vԹ6Va UJ{f+Q`ڹbpĂ;>-rS-eN;I[(7L:xV+vr~=|rٲdF$7,^+a 0,M%uOa3F;Wo~fo7t><2V:$2~xg>.9J]9S/o0ɥnjKˍK$=^|lQ+TlpTey.67=7}KB6\/.o~@4ߎnj _&чv>/wreqLf0d*!Ő /e)fM2}՟{[,x#?τ\0O\ҟFa0?|4dbXF}G^i) %d+䜜{}ߛgpsMo=9:zmeMQ7z;+^ưaFJ?(gܯy IonsBըoW)zhxoU7*7XMvQrmf|neY#|lgqG񅒟UpLQ s/I/wvkF~]'je"`/j8'=Tw)fl?|QNu2:V^^yl|;}o;$\,r>`Ī΅JJpJ9y+>MuhDEa˿pql>y1M3uiaٍ,x@=E|;,$܄WP?f rӏ<%~l>Ec%UYsOȌN|9_5SrOńЏ{Mm>(`'4h&TYf)́,EYPYȡl&!gA}K<*^" ?,pbg6CNN+g'b{!}Zskb!*e" l"3VUis΢sF'/$}ɖ6J\ JdbebRd8J/RR̢\zQJwhwٵU[zB<~ߤ6>L3#i!&m?^.5jQXˋÏ )}i"wN,!añy>Tzܑ;b#[(;ɐk㱊Ku|\Vn*"7"[`tdvDq q{ !C?-OcCWAT3'ʷTˮfF-%.-4r`9#svkeeh;@-C 4Xmb;r<2; Gʤ-*c$up9%H+#{N'wg5YŖh|a+;2u-7-m&N\G K9fd,Q0(vJ32$۝$ʲl8¶Uj1ٮ&SGE,Z^o ݄86Ym͓L9;;gM׃s* E,A4UN U[f9^nexb!]&ȼߧo,8ˉVn<+;69!<]]'JU^9͢dKQ2,E#JpuTi#]Nl.Cqޡl5R디|ϝFLǝGJ $|rFUqYBf`%wCsx9u!QRnvi;_(;$MR\.LWYR|`Dn)yE1dP)0.PeX(x8.iTGubm65ΔƮ4|2񼹊s75 `-mm: Skа9b,tYTtx0;tk4&6J?Sou 0qpi8RD|o-uxӡ=ըR, .fo^^꒯e"˚]AIY?oEJܗUO\we( ?`vw@ {mLVR`ݛˋ/t/.. */t.AhǝvI+XAe6)%8ӥaL1i?ҘVdB*STq/G2>tǻ)g=ik_n1غomȘ F(DjaCe3YJWex%tf}bimzY)l^g7ѝ0u?0нz{)'ɁyIN.'g}0YU$!BJE+, 0 .SLᡕ0;"d곙/QV/,/.K7G#^dRh;ޮዻ R{m|t:ޤih(3)Aify{V&/dv9㧺k1g?K5 ;a|j|_7Jcof2 S$< _i޺vX7bbT55*8UAr>WAJZ9d4 4)ZjޫL9/*#Sns;"߈L̓-UXٛHѤ}ak4T}nE9wH=sH_l?fG]o="0K3*!yǟRRPt:ZM!h3hܓӬ%0@R(A &2 ' jzF'l5 !57 0!]Vo@x8I><(X|Sq;C$N!8@k ^$(蕍DZ7a$|Sʙ)Ђ BdbBN( &dC+ v8MB%ADF#O({vU fFcrwHȂc%HP3kAxh}gG1 ::DG쬆#Q/ rgY[B9zί'!pMY|,c5phLDЂNJtxU[y6Dv- ƫnT붨Mjm(,(a۠ 9OV3HXl" m,W #t (恢Z[.0[w#',TzoZvC&4dSɨ5rkkeڕs$J[1#)iCr6>YW[9c N aRw(~ܼN4jbmA2v|"ђ&\V+r>Lĵ=_O ӈNB!ƚXV/ɍo D]{j!TMI])^6LpHI[<_@_ǧaOfnGWSWvQs\*JPH[{(WzOjךW=]Iȉ?? 
Igixc0RWTeB&ϔbT0U@*!hQ}ӤӤӤӪ6VյtkJc nMЎF: &;-Hd&HI Mv8R4E5ZTo:~dnS=AgY{ɑ+| \dH"r-r/|z#GKld[3] dgE=褄d."X6F#Ѥ\l-D~CɄ"=hDal$Q[ *)ҮV7"f bBޓchIПziB %1b u[բ1&e dS(ɘG9Z m8"-&jJcŔ;`Z>Sqhg6>vX!}HzZ-cԏ}]L }9TR%uF {ڮXjc#}t~Y;)&N!؊j=| x}^77v2ۤa;a՞1&@+mc9-1LHi ^%$%o =\nib)z A=vjtHqN@W$iUvmW5jM$s[96lBXG+]mW͟٤UbkDȂ7XhjMmYG1IvAWk;u^{#wD|e^ofӎqvA4&R=M* \CW l:J'7v16Z2R$i9͑@iQo  &{!^Hr+\_8X(fqCw_f0a<5ZRQˮrw.eTC]5︫x>Oq(xGqƛx-!h5Yع^/iiii_>T h'T€  J`.@RG(:H ƾhT!ؗ/ɇlh9B,ЋIՒ4KŤ :*?=ie8;vPs_\_I>`x0sk#YO&wa?x`noƓV8+зcJš8X\,$%S0H-Q9 Di϶Z u%aCI`X6{ՏO~v`+ Nn=p$TIn\俳j?~Ddu9oTsz=ϓ8;[c_N>=<*~w0_ߏ_~rstN>uv+)O5^fy6-e/ݷ#F X[8[bͽo1MwZ z>sznp%ZrOFc7JBx_7rv(s/?džt~}>؁ZsR=]l6_rӹ+jߟd_g?W}  |'Tv:μy?;'.*948lYK% y @;pu1N{D1=z̧2zŇil,QI뚭[|pl!<|Pc^[ 4>$g/sfsuu|tvvef)y8c^~^i\A|Q8jhN#K?FD2x#Ď2ѫ"O:pR>!p(>z,oySQ>ܱ?deUcmP]n)Dӭz!#{7,2!SQ%Pѿ h/¡5_&i0s5HGHqM Uzo<$v>M35pBaFo3RM[̔@A[$d}琲Mй8IE|3)5C2.[KH|1P"cTvo~ ^2'ܸl0hJFbN i*J31_ʹCUbcO1!XN5ۼERU [r1k,zYYI1x *ziw؊߃{j55ߋ' wONͮp A=|ay<\.YkWע%Ҙ3 a}0Z82G#J$5&ѫSu6jú$ E_j[lcBɧ +1-%6%-bɀ!#`KHva( YTg`b+h.1QN8c9xcK!E[N%*vs$`g%CFrl.|2XPG!X rBpBR'%d:%d%ZBx >X?[2?u;Go"+#$H6Xf(x {"ŒJL0k;;b  01+{LŅ?kZ{e@ʘ!ջƶ^sʻɾh8Z(e̙ E/`"j{j@;vŚ Ah%x&QemRlPֲC`}J^QA*;T1v m2F>|zC1a6' "C~~ 6#C=T&"Dݱ!*]w`BĮἊ!q,Xk|R+ї }`j%67~ҴERWjM#f"tvӖBG] c^h4mif9JlZIj ~^чξ_dcpݗkpy]sy1 W4} .L<>_]/w/Cٓ C;AGh I iM"_n2 .6v/];sx FDիsJ:)8nޛ'a=xÛ JXH㻒Xձm:-LmW0:ZLKw0֥l:ѡ8ճfuY}`wJ>:Myߒ+t[f5)Kl.2#,94b<(X.Ds?]]LY'xjOdDZstPk~,!8m$Ԟ#ф0L z#DY5&a+8{N^Ev/hd.(2) d(Ħ;J葩 $)k}dj۸.ǜJtAHnewp8g{rcB+PБf a a'b@ `$)@-a U N ā3" E2IO&?`. 
*-\t>AM HvKv!M#Ecp]HFT ":,j;SIm!q-w/ekGKI^7̛DL srKZ%jSp]飋i>;2vhTɨfѸXwXvj,*;·R6Fl@]ϯ [4ce_PfDZtT秧)ԫ7swb0O'FTz١yCs}Qkz2_]_~5YSX4[G#ު$^=[ڻQ+`Kirߏ,;--k*gve5b&5;1z?]pj; 0db˝6'eNW4{`eN}Kp>N :r-Dl:JBv>[q; 4{AVu#-Q]1𝈰& ւfJQ}ڽƹV8(4St{h bHӡ@#ɗ :X/ ~M!Y~n$!/ tl줮LgQŗ=|Odfa^68Z _Gh֕`&xoP`Ɓ{dl,!}~~7)̰Ԏ]Ւ¦=qtpOz;%৙{*~lSN"m]l)$1>z6.z(eA9:[O rv&8~qI b4&X"䗽 < }GRvA&(]a$"OiפR(/TB6D #_5ל E asdؤL.º&{I@b?b_D ~zNk=6MU#5m;*$ݳ}8-8KmNrnNoٿM8jH}2d~/6NM=v+ #"v~h!i;*/l{ͣ!+'4\JaՖ ѧCй#dmY䓜Z~}~o +bcHX{d6|~CQ̦A#HmV#;خw#o/bQޗ@w4iicOŅ;OLGv8@j"h:t1۠@ T~[/kY~̶]J*#Np4h;Q1Xq~3"Z;*xa8v[BڹjWM;=M}mq%KoZi-sMQʨـFn8eڶ[sΊ 4mstݓ.@ZdiR]vRQHT/.瀮Xἲ43]:zE:krmha}n"&j,%H42YIdB.Ys]mo#~B2Y|7$rwH! i+IMbKZM6[=;Xn5UbUPPJ6%8e\N'D:5jERY!@3*xY;f-m(Acd>E#THSԧCB {nKWW>c_1ƧUq!rX'ʷdGýz@h5q]4wgz?yu{3:ӛ0pKi>-ȸxZ?>oΘָ7Y)'ɶ{ru#? ]E ϲx6q܋j&[iSPЄ((q#3f̀pe[(TrY- cK1elo \ LMll_f괳Q.ѝ-9y,>D2z/]f&#vV嗀_|r0>]joYN6]=[rl~ '0B F+98}puop ˥LKd=G/] kѻh+NDŽmS☠O~ _/)ĩ.q<=S 6gfJ6z/qus+qba'3g>(Ypr} fGOR*U LɎ-ڜ|<do>s&y@ݣ&=j/&|(xKrwc5XՄIX Ftԇ䔟TkMȧՆ0mTPw{5X,_]sxW~eOW7{Ʈ@A;jaWG-x6S'Qj`S\rXWׂܟߟ7:dw=W!M' *S\}hݔԗ`Ry:cX3]4C[@և|pM)/qrޏU*ߛ l$o*R64N[gvn.'&4(~KQ2Lh/>NU܇S^IqC4?L~5znM'wܐIubϐi)/2Y5?{UWMܾ\Wr%3. 0 (yesIgɭ ߥ@S\SKOQpħjq BၽzIQ;%HVm?>3pʻK#/6y/s|LqLC/5y⍔\S6qCdԣp R@-VV`!sF^WC{E-*@+в$](oŇKQiEk$5#Ġj4j꽴  q7@R!'D LڤtS evla İ[P&F7vA>1岜񓌇7ڝx8a0P ITx;dG3tRG]GhV2N8ޱ FCjׯw&0Fށ{,nwɢNQQx(%d391lQwQᘃ?5\t@`Ry:F l.T'Ӈ|pM)~u[*BT'uvFwLY,@և|pm)>ѮŽ/]]CIɛ)4j֔ˈ2 z`uHJh5A90ӣ29f8׍ߣ2T) %SJ03biX\|]k%JJ/Tg@xa8ަ2lMe(W6aӣK8=c$jP'AFR9]tҌwLh;މ4(C=$62s*9L?welDSh&~Kby(x>Ds`S?KNd/V5aZVB!R(HL0W%Ћ' (Ж8FT,lV˔9Ƃ{АI:%/>nF_D_TN1|ޗ"4N\C[tP@և|pM).V,Ef/JMiOVvIHgY?!KE$Wp*"Tz{! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 12 00:08:55 crc kubenswrapper[4870]: body:
Mar 12 00:08:55 crc kubenswrapper[4870]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776205087 +0000 UTC m=+13.379621427,LastTimestamp:2026-03-12 00:08:42.776205087 +0000 UTC m=+13.379621427,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 00:08:55 crc kubenswrapper[4870]: >
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.357653 4870 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef6b6179e107 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776338695 +0000 UTC m=+13.379755035,LastTimestamp:2026-03-12 00:08:42.776338695 +0000 UTC m=+13.379755035,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.365051 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bef692193f819\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bef692193f819 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:33.114372121 +0000 UTC m=+3.717788451,LastTimestamp:2026-03-12 00:08:44.223889373 +0000 UTC m=+14.827305713,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.371711 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bef692de9e897\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bef692de9e897 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:33.321330839 +0000 UTC m=+3.924747159,LastTimestamp:2026-03-12 00:08:44.453163221 +0000 UTC m=+15.056579541,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.378300 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bef692ea1cba6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bef692ea1cba6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:33.333382054 +0000 UTC m=+3.936798374,LastTimestamp:2026-03-12 00:08:44.46442556 +0000 UTC m=+15.067841880,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.383671 4870 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 12 00:08:55 crc kubenswrapper[4870]: &Event{ObjectMeta:{kube-apiserver-crc.189bef6bd61d509c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 12 00:08:55 crc kubenswrapper[4870]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 00:08:55 crc kubenswrapper[4870]:
Mar 12 00:08:55 crc kubenswrapper[4870]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:44.733206684 +0000 UTC m=+15.336623014,LastTimestamp:2026-03-12 00:08:44.733206684 +0000 UTC m=+15.336623014,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 00:08:55 crc kubenswrapper[4870]: >
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.390449 4870 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bef6bd61df274 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:44.733248116 +0000 UTC m=+15.336664446,LastTimestamp:2026-03-12 00:08:44.733248116 +0000 UTC m=+15.336664446,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.397211 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bef6bd61d509c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 12 00:08:55 crc kubenswrapper[4870]: &Event{ObjectMeta:{kube-apiserver-crc.189bef6bd61d509c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 12 00:08:55 crc kubenswrapper[4870]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 00:08:55 crc kubenswrapper[4870]:
Mar 12 00:08:55 crc kubenswrapper[4870]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:44.733206684 +0000 UTC m=+15.336623014,LastTimestamp:2026-03-12 00:08:44.739872204 +0000 UTC m=+15.343288524,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 00:08:55 crc kubenswrapper[4870]: >
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.402124 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bef6bd61df274\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bef6bd61df274 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:44.733248116 +0000 UTC m=+15.336664446,LastTimestamp:2026-03-12 00:08:44.739914765 +0000 UTC m=+15.343331085,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.408207 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef6b6177d71f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 12 00:08:55 crc kubenswrapper[4870]: &Event{ObjectMeta:{kube-controller-manager-crc.189bef6b6177d71f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 12 00:08:55 crc kubenswrapper[4870]: body:
Mar 12 00:08:55 crc kubenswrapper[4870]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776205087 +0000 UTC m=+13.379621427,LastTimestamp:2026-03-12 00:08:52.77752593 +0000 UTC m=+23.380942290,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 00:08:55 crc kubenswrapper[4870]: >
Mar 12 00:08:55 crc kubenswrapper[4870]: E0312 00:08:55.412316 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef6b6179e107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef6b6179e107 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776338695 +0000 UTC m=+13.379755035,LastTimestamp:2026-03-12 00:08:52.777634813 +0000 UTC m=+23.381051173,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:08:56 crc kubenswrapper[4870]: I0312 00:08:56.044864 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:08:57 crc kubenswrapper[4870]: I0312 00:08:57.044795 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:08:58 crc kubenswrapper[4870]: I0312 00:08:58.045227 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:08:58 crc kubenswrapper[4870]: E0312 00:08:58.150489 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 00:08:58 crc kubenswrapper[4870]: I0312 00:08:58.161588 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:08:58 crc kubenswrapper[4870]: I0312 00:08:58.163239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:08:58 crc kubenswrapper[4870]: I0312 00:08:58.163292 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:08:58 crc kubenswrapper[4870]: I0312 00:08:58.163309 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:08:58 crc kubenswrapper[4870]: I0312 00:08:58.163340 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 00:08:58 crc kubenswrapper[4870]: E0312 00:08:58.170436 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 00:08:58 crc kubenswrapper[4870]: W0312 00:08:58.751537 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 12 00:08:58 crc kubenswrapper[4870]: E0312 00:08:58.751598 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 12 00:08:59 crc kubenswrapper[4870]: I0312 00:08:59.045984 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:08:59 crc kubenswrapper[4870]: W0312 00:08:59.327694 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:08:59 crc kubenswrapper[4870]: E0312 00:08:59.327806 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 12 00:09:00 crc kubenswrapper[4870]: I0312 00:09:00.044716 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:00 crc kubenswrapper[4870]: E0312 00:09:00.194565 4870 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 00:09:00 crc kubenswrapper[4870]: W0312 00:09:00.702051 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 12 00:09:00 crc kubenswrapper[4870]: E0312 00:09:00.702105 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 12 00:09:01 crc kubenswrapper[4870]: I0312 00:09:01.044382 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.045605 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.776779 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.776911 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.777003 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.777294 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.779086 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.779220 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.779249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.780241 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 12 00:09:02 crc kubenswrapper[4870]: I0312 00:09:02.780698 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750" gracePeriod=30
Mar 12 00:09:02 crc kubenswrapper[4870]: E0312 00:09:02.785163 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef6b6177d71f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 12 00:09:02 crc kubenswrapper[4870]: &Event{ObjectMeta:{kube-controller-manager-crc.189bef6b6177d71f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 12 00:09:02 crc kubenswrapper[4870]: body:
Mar 12 00:09:02 crc kubenswrapper[4870]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776205087 +0000 UTC m=+13.379621427,LastTimestamp:2026-03-12 00:09:02.776869962 +0000 UTC m=+33.380286302,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 00:09:02 crc kubenswrapper[4870]: >
Mar 12 00:09:02 crc kubenswrapper[4870]: E0312 00:09:02.793552 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef6b6179e107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef6b6179e107 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776338695 +0000 UTC m=+13.379755035,LastTimestamp:2026-03-12 00:09:02.776957174 +0000 UTC m=+33.380373524,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:09:02 crc kubenswrapper[4870]: E0312 00:09:02.800613 4870 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef7009d3893e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:09:02.780655934 +0000 UTC m=+33.384072304,LastTimestamp:2026-03-12 00:09:02.780655934 +0000 UTC m=+33.384072304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:09:02 crc kubenswrapper[4870]: E0312 00:09:02.919837 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef68b9bf1e86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef68b9bf1e86 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:31.372369542 +0000 UTC m=+1.975785862,LastTimestamp:2026-03-12 00:09:02.911988516 +0000 UTC m=+33.515404856,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.045339 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.104096 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.105815 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.105849 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar
12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.105864 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.106671 4870 scope.go:117] "RemoveContainer" containerID="2b04762e12c6ef8998bc8d22e059408d6790569ecac2f3ad9caeefc796e729b8" Mar 12 00:09:03 crc kubenswrapper[4870]: E0312 00:09:03.157877 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef68cc20b8c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef68cc20b8c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:31.680755907 +0000 UTC m=+2.284172227,LastTimestamp:2026-03-12 00:09:03.148472445 +0000 UTC m=+33.751888785,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 00:09:03 crc kubenswrapper[4870]: E0312 00:09:03.169141 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef68cccbbbca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef68cccbbbca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:31.691963338 +0000 UTC m=+2.295379688,LastTimestamp:2026-03-12 00:09:03.161693001 +0000 UTC m=+33.765109351,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.297736 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.298361 4870 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750" exitCode=255 Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.298435 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750"} Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.298624 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.298641 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d"} Mar 12 00:09:03 
crc kubenswrapper[4870]: I0312 00:09:03.299713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.299770 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:03 crc kubenswrapper[4870]: I0312 00:09:03.299789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.046324 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.304270 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.306110 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85"} Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.306234 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.306336 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.307780 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.307838 4870 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.307851 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.307968 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.308014 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:04 crc kubenswrapper[4870]: I0312 00:09:04.308032 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.044961 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:05 crc kubenswrapper[4870]: E0312 00:09:05.158984 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.171327 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.173232 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.173287 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.173304 4870 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.173344 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 00:09:05 crc kubenswrapper[4870]: E0312 00:09:05.181474 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.311456 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.312710 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.316934 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85" exitCode=255 Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.316993 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85"} Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.317044 4870 scope.go:117] "RemoveContainer" containerID="2b04762e12c6ef8998bc8d22e059408d6790569ecac2f3ad9caeefc796e729b8" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.317343 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.319352 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.319422 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.319445 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:05 crc kubenswrapper[4870]: I0312 00:09:05.320522 4870 scope.go:117] "RemoveContainer" containerID="c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85" Mar 12 00:09:05 crc kubenswrapper[4870]: E0312 00:09:05.320875 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 00:09:06 crc kubenswrapper[4870]: I0312 00:09:06.044988 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:06 crc kubenswrapper[4870]: I0312 00:09:06.325072 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.044521 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.132613 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.132880 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.134713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.134773 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.134791 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.135678 4870 scope.go:117] "RemoveContainer" containerID="c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85" Mar 12 00:09:07 crc kubenswrapper[4870]: E0312 00:09:07.135970 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.323720 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.331941 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.333924 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.334128 4870 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.334336 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:07 crc kubenswrapper[4870]: I0312 00:09:07.335452 4870 scope.go:117] "RemoveContainer" containerID="c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85" Mar 12 00:09:07 crc kubenswrapper[4870]: E0312 00:09:07.335889 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 00:09:08 crc kubenswrapper[4870]: I0312 00:09:08.045445 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:09 crc kubenswrapper[4870]: I0312 00:09:09.049602 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:09 crc kubenswrapper[4870]: W0312 00:09:09.737228 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 12 00:09:09 crc kubenswrapper[4870]: E0312 00:09:09.737308 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is 
forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 12 00:09:09 crc kubenswrapper[4870]: I0312 00:09:09.776213 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:09:09 crc kubenswrapper[4870]: I0312 00:09:09.776386 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:09 crc kubenswrapper[4870]: I0312 00:09:09.777985 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:09 crc kubenswrapper[4870]: I0312 00:09:09.778051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:09 crc kubenswrapper[4870]: I0312 00:09:09.778078 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:10 crc kubenswrapper[4870]: I0312 00:09:10.044452 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:10 crc kubenswrapper[4870]: E0312 00:09:10.194799 4870 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 00:09:11 crc kubenswrapper[4870]: I0312 00:09:11.044711 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:11 crc kubenswrapper[4870]: I0312 00:09:11.977607 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:09:11 crc kubenswrapper[4870]: I0312 00:09:11.978474 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:11 crc kubenswrapper[4870]: I0312 00:09:11.980304 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:11 crc kubenswrapper[4870]: I0312 00:09:11.980361 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:11 crc kubenswrapper[4870]: I0312 00:09:11.980384 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.044700 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:12 crc kubenswrapper[4870]: E0312 00:09:12.165556 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.181556 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.183553 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.183619 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.183645 4870 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.183691 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 00:09:12 crc kubenswrapper[4870]: E0312 00:09:12.190538 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.776825 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 00:09:12 crc kubenswrapper[4870]: I0312 00:09:12.777938 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 00:09:12 crc kubenswrapper[4870]: E0312 00:09:12.781943 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef6b6177d71f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 00:09:12 crc kubenswrapper[4870]: &Event{ObjectMeta:{kube-controller-manager-crc.189bef6b6177d71f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 12 00:09:12 crc kubenswrapper[4870]: body: Mar 12 00:09:12 crc kubenswrapper[4870]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776205087 +0000 UTC m=+13.379621427,LastTimestamp:2026-03-12 00:09:12.776903392 +0000 UTC m=+43.380319742,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 00:09:12 crc kubenswrapper[4870]: > Mar 12 00:09:12 crc kubenswrapper[4870]: E0312 00:09:12.786089 4870 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bef6b6179e107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bef6b6179e107 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:08:42.776338695 +0000 UTC m=+13.379755035,LastTimestamp:2026-03-12 00:09:12.778102064 
+0000 UTC m=+43.381518374,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 00:09:13 crc kubenswrapper[4870]: I0312 00:09:13.044229 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:14 crc kubenswrapper[4870]: I0312 00:09:14.046486 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:15 crc kubenswrapper[4870]: I0312 00:09:15.044464 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:16 crc kubenswrapper[4870]: I0312 00:09:16.044935 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:17 crc kubenswrapper[4870]: I0312 00:09:17.044056 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:18 crc kubenswrapper[4870]: I0312 00:09:18.045127 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 
00:09:18 crc kubenswrapper[4870]: I0312 00:09:18.103920 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 00:09:18 crc kubenswrapper[4870]: I0312 00:09:18.105342 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:18 crc kubenswrapper[4870]: I0312 00:09:18.105405 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:18 crc kubenswrapper[4870]: I0312 00:09:18.105428 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:18 crc kubenswrapper[4870]: I0312 00:09:18.106498 4870 scope.go:117] "RemoveContainer" containerID="c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85" Mar 12 00:09:18 crc kubenswrapper[4870]: E0312 00:09:18.106864 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 00:09:18 crc kubenswrapper[4870]: W0312 00:09:18.320695 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 12 00:09:18 crc kubenswrapper[4870]: E0312 00:09:18.320813 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 12 00:09:18 crc kubenswrapper[4870]: W0312 
00:09:18.403531 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 12 00:09:18 crc kubenswrapper[4870]: E0312 00:09:18.403617 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 12 00:09:18 crc kubenswrapper[4870]: W0312 00:09:18.719949 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:18 crc kubenswrapper[4870]: E0312 00:09:18.720035 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 12 00:09:19 crc kubenswrapper[4870]: I0312 00:09:19.046294 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 00:09:19 crc kubenswrapper[4870]: E0312 00:09:19.174177 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="7s"
Mar 12 00:09:19 crc kubenswrapper[4870]: I0312 00:09:19.191436 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:19 crc kubenswrapper[4870]: I0312 00:09:19.193015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:19 crc kubenswrapper[4870]: I0312 00:09:19.193069 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:19 crc kubenswrapper[4870]: I0312 00:09:19.193096 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:19 crc kubenswrapper[4870]: I0312 00:09:19.193140 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 00:09:19 crc kubenswrapper[4870]: E0312 00:09:19.200635 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 00:09:20 crc kubenswrapper[4870]: I0312 00:09:20.043869 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:20 crc kubenswrapper[4870]: E0312 00:09:20.195008 4870 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 00:09:21 crc kubenswrapper[4870]: I0312 00:09:21.046452 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:21 crc kubenswrapper[4870]: I0312 00:09:21.639134 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 00:09:21 crc kubenswrapper[4870]: I0312 00:09:21.639398 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:21 crc kubenswrapper[4870]: I0312 00:09:21.641003 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:21 crc kubenswrapper[4870]: I0312 00:09:21.641139 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:21 crc kubenswrapper[4870]: I0312 00:09:21.641201 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:21 crc kubenswrapper[4870]: I0312 00:09:21.647825 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 00:09:22 crc kubenswrapper[4870]: I0312 00:09:22.046204 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:22 crc kubenswrapper[4870]: I0312 00:09:22.374558 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:22 crc kubenswrapper[4870]: I0312 00:09:22.375523 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:22 crc kubenswrapper[4870]: I0312 00:09:22.375817 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:22 crc kubenswrapper[4870]: I0312 00:09:22.375913 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:23 crc kubenswrapper[4870]: I0312 00:09:23.045286 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:24 crc kubenswrapper[4870]: I0312 00:09:24.045700 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:25 crc kubenswrapper[4870]: I0312 00:09:25.079923 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:25 crc kubenswrapper[4870]: I0312 00:09:25.542678 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 12 00:09:25 crc kubenswrapper[4870]: I0312 00:09:25.542888 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:25 crc kubenswrapper[4870]: I0312 00:09:25.544019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:25 crc kubenswrapper[4870]: I0312 00:09:25.544059 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:25 crc kubenswrapper[4870]: I0312 00:09:25.544093 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:26 crc kubenswrapper[4870]: I0312 00:09:26.042896 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:26 crc kubenswrapper[4870]: E0312 00:09:26.180700 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 00:09:26 crc kubenswrapper[4870]: I0312 00:09:26.200961 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:26 crc kubenswrapper[4870]: I0312 00:09:26.202071 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:26 crc kubenswrapper[4870]: I0312 00:09:26.202113 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:26 crc kubenswrapper[4870]: I0312 00:09:26.202127 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:26 crc kubenswrapper[4870]: I0312 00:09:26.202173 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 00:09:26 crc kubenswrapper[4870]: E0312 00:09:26.207345 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 00:09:27 crc kubenswrapper[4870]: I0312 00:09:27.044339 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:28 crc kubenswrapper[4870]: I0312 00:09:28.045347 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:29 crc kubenswrapper[4870]: I0312 00:09:29.044757 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.044905 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.104134 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.105963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.106017 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.106028 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.106662 4870 scope.go:117] "RemoveContainer" containerID="c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85"
Mar 12 00:09:30 crc kubenswrapper[4870]: E0312 00:09:30.195198 4870 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.394668 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.396288 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0"}
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.396438 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.397201 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.397230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:30 crc kubenswrapper[4870]: I0312 00:09:30.397239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:31 crc kubenswrapper[4870]: I0312 00:09:31.046542 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.044694 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.406411 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.407408 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.410567 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0" exitCode=255
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.410619 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0"}
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.410681 4870 scope.go:117] "RemoveContainer" containerID="c0d40905d50a931c148515349da1b71721a79a2336ed111744848a486b40aa85"
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.411383 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.413516 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.413560 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.413580 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:32 crc kubenswrapper[4870]: I0312 00:09:32.414509 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0"
Mar 12 00:09:32 crc kubenswrapper[4870]: E0312 00:09:32.414814 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 12 00:09:33 crc kubenswrapper[4870]: I0312 00:09:33.044415 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:33 crc kubenswrapper[4870]: E0312 00:09:33.188531 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 00:09:33 crc kubenswrapper[4870]: I0312 00:09:33.207752 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:33 crc kubenswrapper[4870]: I0312 00:09:33.209692 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:33 crc kubenswrapper[4870]: I0312 00:09:33.209749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:33 crc kubenswrapper[4870]: I0312 00:09:33.209766 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:33 crc kubenswrapper[4870]: I0312 00:09:33.209799 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 00:09:33 crc kubenswrapper[4870]: E0312 00:09:33.213193 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 00:09:33 crc kubenswrapper[4870]: I0312 00:09:33.415456 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 12 00:09:34 crc kubenswrapper[4870]: I0312 00:09:34.045016 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 00:09:35 crc kubenswrapper[4870]: I0312 00:09:35.019237 4870 csr.go:261] certificate signing request csr-67p6c is approved, waiting to be issued
Mar 12 00:09:35 crc kubenswrapper[4870]: I0312 00:09:35.028368 4870 csr.go:257] certificate signing request csr-67p6c is issued
Mar 12 00:09:35 crc kubenswrapper[4870]: I0312 00:09:35.047920 4870 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 12 00:09:35 crc kubenswrapper[4870]: I0312 00:09:35.889976 4870 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 12 00:09:36 crc kubenswrapper[4870]: I0312 00:09:36.030405 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 07:51:17.136682325 +0000 UTC
Mar 12 00:09:36 crc kubenswrapper[4870]: I0312 00:09:36.030472 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7327h41m41.106215629s for next certificate rotation
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.132989 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.133251 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.134602 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.134746 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.134775 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.135870 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0"
Mar 12 00:09:37 crc kubenswrapper[4870]: E0312 00:09:37.136212 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.323774 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.426534 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.427776 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.427837 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.427854 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:37 crc kubenswrapper[4870]: I0312 00:09:37.428751 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0"
Mar 12 00:09:37 crc kubenswrapper[4870]: E0312 00:09:37.429053 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.195301 4870 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.214268 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.215650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.215736 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.215755 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.215892 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.226022 4870 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.226384 4870 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.226421 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.230443 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.230469 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.230477 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.230489 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.230500 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:40Z","lastTransitionTime":"2026-03-12T00:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.253482 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.261225 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.261316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.261334 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.261355 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.261372 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:40Z","lastTransitionTime":"2026-03-12T00:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.273731 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.281022 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.281072 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.281089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.281109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.281123 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:40Z","lastTransitionTime":"2026-03-12T00:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.291080 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.298111 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.298189 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.298202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.298221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:40 crc kubenswrapper[4870]: I0312 00:09:40.298235 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:40Z","lastTransitionTime":"2026-03-12T00:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.308539 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.308645 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.308670 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.409617 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.510210 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.611091 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.711751 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.812555 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:40 crc kubenswrapper[4870]: E0312 00:09:40.913648 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.014569 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.114712 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.215443 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.315620 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.416441 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.516567 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.617172 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.717760 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.818411 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:41 crc kubenswrapper[4870]: E0312 00:09:41.919437 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.020088 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.121234 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.221752 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.322930 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.423550 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.524393 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.625134 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.726273 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.826833 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:42 crc kubenswrapper[4870]: E0312 00:09:42.927083 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.027520 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.128224 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.228707 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.328837 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.429295 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.530392 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.630976 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.731316 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.831920 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:43 crc kubenswrapper[4870]: E0312 00:09:43.932861 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.033004 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.133259 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.234255 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.335524 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.436196 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.537344 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.637861 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.738243 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.838429 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:44 crc kubenswrapper[4870]: E0312 00:09:44.939206 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.039389 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.139762 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.240222 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.340361 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.440504 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.541759 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.642448 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.743484 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312 00:09:45.844683 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 00:09:45 crc kubenswrapper[4870]: E0312
00:09:45.945208 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.045542 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.146005 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.226708 4870 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.246281 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.347396 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.448276 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.548750 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.649880 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: E0312 00:09:46.750988 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.784729 4870 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.854132 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 
00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.854207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.854224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.854265 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.854283 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:46Z","lastTransitionTime":"2026-03-12T00:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.958202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.958647 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.959108 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.959578 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:46 crc kubenswrapper[4870]: I0312 00:09:46.959970 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:46Z","lastTransitionTime":"2026-03-12T00:09:46Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.063819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.064302 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.064458 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.064596 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.064725 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.082385 4870 apiserver.go:52] "Watching apiserver" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.091971 4870 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.093346 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xwrqb","openshift-machine-config-operator/machine-config-daemon-84dfr","openshift-multus/multus-8hngl","openshift-multus/network-metrics-daemon-xkrk6","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-46q4m","openshift-image-registry/node-ca-bnt7c","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-multus/multus-additional-cni-plugins-7fbnk","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.094044 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.094337 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.094648 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.094975 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.095203 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.095699 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.095989 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.096311 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.096401 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.097324 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.097392 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-46q4m" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.097500 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.097495 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.098714 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.099886 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.100575 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.101130 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.101716 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.104620 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.104837 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.105084 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.105577 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.105634 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.105853 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.105857 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.106140 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.106583 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.107193 4870 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.107269 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.107690 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.107810 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.107965 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.108297 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.108346 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.108719 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110108 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110245 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110313 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110607 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110838 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110900 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110981 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111029 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111122 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111273 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.110839 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111446 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111503 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111510 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111715 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.111857 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.112111 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.112247 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.112292 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.112538 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.134471 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.141852 4870 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.147841 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.169909 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.174086 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.174179 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.174207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.174236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.174259 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.187842 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.208899 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.218100 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225185 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225259 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225290 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225316 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225349 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225377 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225400 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225425 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225457 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225485 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225506 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225527 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225548 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225571 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225594 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225617 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225640 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225674 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225700 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225723 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225750 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225850 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225875 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225897 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225919 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225934 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.225985 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226016 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226038 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226062 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226086 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226107 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226133 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226205 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226234 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226257 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226280 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226302 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226325 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226346 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226368 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226390 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226412 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226434 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226459 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226481 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226502 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226507 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226523 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226608 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226650 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226687 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226683 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226722 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226759 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226793 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226910 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226945 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226982 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227016 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227052 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227085 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227118 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227191 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227235 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227274 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227321 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227357 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227394 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227428 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227462 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227524 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227559 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227594 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227635 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227671 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227706 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227744 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227777 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227814 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227851 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227886 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227920 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227956 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227989 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228025 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228064 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228100 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228136 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228201 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228272 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228311 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228344 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228380 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228419 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228470 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228524 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228578 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228632 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228679 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228732 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228779 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228825 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228878 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228937 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.231393 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.231592 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.231649 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226901 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.231709 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.226976 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227004 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227324 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227342 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227740 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.227992 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228109 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228329 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.228465 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.229030 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.229033 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.229742 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.229847 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.229955 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.230569 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.231663 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232130 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232214 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232208 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232313 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232407 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232455 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232814 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232968 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.232987 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.233030 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.233049 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.233543 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.234073 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.234123 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.236021 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.236466 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.236660 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.237129 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.237228 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.237469 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.237476 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.237520 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.237937 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.238006 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.238183 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.238319 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.238433 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.231769 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239575 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239624 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239686 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 00:09:47 crc 
kubenswrapper[4870]: I0312 00:09:47.239713 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239766 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239790 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239843 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239867 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239914 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239938 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.239958 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240011 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240035 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240268 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240370 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240396 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240447 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240475 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240525 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240549 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240570 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240619 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240642 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240691 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240717 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 00:09:47 crc 
kubenswrapper[4870]: I0312 00:09:47.240737 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240787 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240810 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240858 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240883 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240932 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240962 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.240988 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241042 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241068 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241089 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241136 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241195 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241219 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241273 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241297 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241319 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241340 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241362 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241382 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241403 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241425 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 12 00:09:47 crc kubenswrapper[4870]: 
I0312 00:09:47.241446 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241505 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241531 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241552 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241576 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241598 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241619 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241642 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241662 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241685 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241707 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241729 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241751 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241774 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241797 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241819 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241842 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241903 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241927 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241959 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.241983 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242004 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242024 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242046 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242081 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242112 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242170 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242205 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 00:09:47 crc 
kubenswrapper[4870]: I0312 00:09:47.242241 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242263 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242287 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242310 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242332 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242357 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242377 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242402 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242425 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242446 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242474 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242503 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242777 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242866 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242900 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-node-log\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242928 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/988c0290-1e98-46c8-8253-a4718914b9ef-proxy-tls\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242959 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9fp\" (UniqueName: \"kubernetes.io/projected/988c0290-1e98-46c8-8253-a4718914b9ef-kube-api-access-xs9fp\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.242990 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-os-release\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.243020 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-log-socket\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.243051 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.243244 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-cnibin\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: 
I0312 00:09:47.243281 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.243867 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.244721 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.244775 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.244810 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.245007 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.245171 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.245659 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.245668 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.245817 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.245885 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.246339 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.246838 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.246844 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.247427 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.247432 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.248862 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.248931 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.249277 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.249501 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.249709 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.249525 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.249865 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.249982 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250545 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250269 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250262 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250434 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250534 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250640 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250691 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250973 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.250983 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.251213 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.251311 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.251533 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.251767 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.251775 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.251878 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.251893 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.252078 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.252140 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.252503 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.252932 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.252962 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253014 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.252749 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253060 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253300 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253951 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253385 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: 
"serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253395 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.254102 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253401 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253498 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253765 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253935 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.253944 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.254073 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.254424 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.254207 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.255022 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.255068 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.255333 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.255413 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-systemd\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.255542 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:47.755498896 +0000 UTC m=+78.358915246 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.257094 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.257691 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.257098 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.255630 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.254597 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.254590 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.254627 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.255831 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.255881 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.256024 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.256202 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.254560 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.256761 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.256783 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.257874 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:09:47.757847093 +0000 UTC m=+78.361263553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258163 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/217db5b4-2e71-4611-8091-53f047a1b1e5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258209 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-slash\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258240 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-netns\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258264 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-config\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258286 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-socket-dir-parent\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258313 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-multus-certs\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258728 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-system-cni-dir\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.258759 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5dbda14f-f860-4f24-ab29-43678602f4e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259560 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259600 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/354ecab7-6a88-47ab-8645-233ac3a125a5-hosts-file\") pod \"node-resolver-46q4m\" (UID: \"354ecab7-6a88-47ab-8645-233ac3a125a5\") " pod="openshift-dns/node-resolver-46q4m" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259622 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-k8s-cni-cncf-io\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259645 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwtlf\" (UniqueName: \"kubernetes.io/projected/2ad1e98a-cb66-436d-8e5e-301724f70769-kube-api-access-zwtlf\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259669 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-host\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259695 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-systemd-units\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259721 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-cni-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259748 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-daemon-config\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259775 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259800 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259826 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-var-lib-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259849 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-netd\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259876 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxx2f\" (UniqueName: \"kubernetes.io/projected/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-kube-api-access-fxx2f\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.259986 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260014 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-serviceca\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260042 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260067 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/217db5b4-2e71-4611-8091-53f047a1b1e5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260089 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-bin\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260166 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260192 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-script-lib\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260222 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260249 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45w79\" (UniqueName: \"kubernetes.io/projected/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-kube-api-access-45w79\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260505 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/988c0290-1e98-46c8-8253-a4718914b9ef-mcd-auth-proxy-config\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260699 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/217db5b4-2e71-4611-8091-53f047a1b1e5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260761 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-etc-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260803 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-netns\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260833 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-hostroot\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260855 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5dbda14f-f860-4f24-ab29-43678602f4e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260905 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.260989 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr49h\" (UniqueName: \"kubernetes.io/projected/467385e2-3bbf-4cf0-909a-8e878b5d86dc-kube-api-access-hr49h\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.261021 4870 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.261055 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-cni-multus\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.261750 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262316 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262419 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqh6s\" (UniqueName: \"kubernetes.io/projected/5dbda14f-f860-4f24-ab29-43678602f4e3-kube-api-access-gqh6s\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262613 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovn-node-metrics-cert\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262690 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262754 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-cni-bin\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262794 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262859 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262902 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bn5d\" (UniqueName: \"kubernetes.io/projected/217db5b4-2e71-4611-8091-53f047a1b1e5-kube-api-access-7bn5d\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.262968 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/354ecab7-6a88-47ab-8645-233ac3a125a5-kube-api-access-5bblc\") pod \"node-resolver-46q4m\" (UID: \"354ecab7-6a88-47ab-8645-233ac3a125a5\") " pod="openshift-dns/node-resolver-46q4m"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.263041 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-ovn\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.263072 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-kubelet\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.263127 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-os-release\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.263350 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ad1e98a-cb66-436d-8e5e-301724f70769-cni-binary-copy\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.263538 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.263601 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:47.763582897 +0000 UTC m=+78.366999418 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.263602 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-system-cni-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.263712 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-kubelet\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.263923 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.264683 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-conf-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.264740 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-etc-kubernetes\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.264787 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.264884 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-env-overrides\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265007 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-cnibin\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265098 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/988c0290-1e98-46c8-8253-a4718914b9ef-rootfs\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265372 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265437 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265461 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265481 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265540 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265561 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265623 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265650 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265669 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266357 4870 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266437 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.265752 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266457 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266518 4870 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266539 4870 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266746 4870 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266773 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266847 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266867 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266922 4870 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266943 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.266962 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267020 4870 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267040 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267058 4870 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267108 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267128 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267202 4870 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267223 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267242 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267302 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267323 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267381 4870 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267405 4870 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267390 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267424 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267538 4870 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267552 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267563 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267572 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267583 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267594 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267605 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267615 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267627 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267637 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267647 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267659 4870 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267680 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267697 4870 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267710 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267722 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267735 4870 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267748 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267761 4870 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267771 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267780 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267789 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267798 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267807 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267816 4870 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName:
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267827 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267838 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267849 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267859 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267871 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267884 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267896 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267905 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267914 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267922 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267930 4870 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267939 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267950 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267961 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267972 4870 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267983 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.267993 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268002 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268013 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268024 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268034 4870 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268044 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268055 4870 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268065 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268073 4870 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268082 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268090 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268099 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268108 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268117 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268125 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268134 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268168 4870 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268177 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268186 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268195 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268203 4870 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268213 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268223 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268233 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268242 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268251 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268261 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" 
Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268272 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268281 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268291 4870 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268301 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268311 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268320 4870 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268330 4870 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268339 4870 reconciler_common.go:293] "Volume detached for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268349 4870 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268359 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268369 4870 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268378 4870 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268389 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268399 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268408 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.268418 4870 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.278813 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.278873 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.279163 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.279185 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.279201 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.279196 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.279201 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.279253 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.279269 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:47.779247846 +0000 UTC m=+78.382664346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.279254 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.279595 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.279760 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.279856 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280104 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280170 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280187 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280243 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280260 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280340 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280342 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280577 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280610 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280659 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.280742 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.280772 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.280794 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.280887 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:47.780840001 +0000 UTC m=+78.384256351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.280981 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.281805 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.281931 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.282136 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.282409 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.282456 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.282539 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.283793 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.283872 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.284291 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.284384 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.284736 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.285166 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.288200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.288308 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.288330 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.288396 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.288429 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.289354 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.290436 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.291123 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.291576 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.291848 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.292730 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.292794 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.293078 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.293167 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.293286 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.293896 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.294023 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.294536 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.294940 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.295208 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.295243 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.296032 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.296069 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.296531 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.296576 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.296532 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.296933 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.297010 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.297263 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.297773 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.297784 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.298401 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.298455 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.298634 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.298692 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.298872 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.299315 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.299345 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.300345 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.300414 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.300462 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.301355 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.301564 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.302331 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.302957 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.303104 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.303650 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.303696 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.303772 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.303948 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.304071 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.304126 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.313648 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.324688 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.326299 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.332634 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.333626 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.344669 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.344769 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.353496 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.369248 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxx2f\" (UniqueName: \"kubernetes.io/projected/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-kube-api-access-fxx2f\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.369577 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-serviceca\") pod \"node-ca-bnt7c\" (UID: 
\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.369674 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.369763 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-script-lib\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.369837 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/217db5b4-2e71-4611-8091-53f047a1b1e5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.369792 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.369903 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-bin\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370024 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-bin\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370047 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-netns\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370191 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-hostroot\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370276 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45w79\" (UniqueName: \"kubernetes.io/projected/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-kube-api-access-45w79\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370360 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/988c0290-1e98-46c8-8253-a4718914b9ef-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370443 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-script-lib\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370361 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/217db5b4-2e71-4611-8091-53f047a1b1e5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370266 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-hostroot\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370105 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-netns\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370687 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/217db5b4-2e71-4611-8091-53f047a1b1e5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370787 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-etc-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370855 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/988c0290-1e98-46c8-8253-a4718914b9ef-mcd-auth-proxy-config\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370900 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5dbda14f-f860-4f24-ab29-43678602f4e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370916 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-etc-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370948 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqh6s\" (UniqueName: \"kubernetes.io/projected/5dbda14f-f860-4f24-ab29-43678602f4e3-kube-api-access-gqh6s\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: 
\"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370970 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovn-node-metrics-cert\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.370985 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr49h\" (UniqueName: \"kubernetes.io/projected/467385e2-3bbf-4cf0-909a-8e878b5d86dc-kube-api-access-hr49h\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371004 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-cni-multus\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371035 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-cni-bin\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371053 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bn5d\" (UniqueName: \"kubernetes.io/projected/217db5b4-2e71-4611-8091-53f047a1b1e5-kube-api-access-7bn5d\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: 
\"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371061 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-serviceca\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371092 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-ovn\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371069 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-ovn\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371123 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/354ecab7-6a88-47ab-8645-233ac3a125a5-kube-api-access-5bblc\") pod \"node-resolver-46q4m\" (UID: \"354ecab7-6a88-47ab-8645-233ac3a125a5\") " pod="openshift-dns/node-resolver-46q4m" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371131 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-cni-multus\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc 
kubenswrapper[4870]: I0312 00:09:47.371139 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-os-release\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371199 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-os-release\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371204 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ad1e98a-cb66-436d-8e5e-301724f70769-cni-binary-copy\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371225 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-cni-bin\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371241 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-kubelet\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371282 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-env-overrides\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371309 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-system-cni-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371333 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-kubelet\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371359 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-conf-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371386 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-etc-kubernetes\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371414 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/988c0290-1e98-46c8-8253-a4718914b9ef-rootfs\") pod \"machine-config-daemon-84dfr\" (UID: 
\"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371446 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-cnibin\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371476 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-os-release\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371508 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371544 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-node-log\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371575 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/988c0290-1e98-46c8-8253-a4718914b9ef-proxy-tls\") pod \"machine-config-daemon-84dfr\" (UID: 
\"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371605 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9fp\" (UniqueName: \"kubernetes.io/projected/988c0290-1e98-46c8-8253-a4718914b9ef-kube-api-access-xs9fp\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371637 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-systemd\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371668 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371697 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-log-socket\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371731 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-xwrqb\" (UID: 
\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371760 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-cnibin\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371775 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-env-overrides\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371791 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ad1e98a-cb66-436d-8e5e-301724f70769-cni-binary-copy\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371824 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-node-log\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371792 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371853 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-var-lib-kubelet\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371871 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/217db5b4-2e71-4611-8091-53f047a1b1e5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.371879 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371907 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/988c0290-1e98-46c8-8253-a4718914b9ef-rootfs\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371891 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-etc-kubernetes\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371910 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-system-cni-dir\") pod \"multus-additional-cni-plugins-7fbnk\" 
(UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371910 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-system-cni-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.371942 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:47.871924708 +0000 UTC m=+78.475341098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372033 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-cnibin\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372086 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-log-socket\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372274 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-systemd\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372325 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371875 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-conf-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371241 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/217db5b4-2e71-4611-8091-53f047a1b1e5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.371890 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-system-cni-dir\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372376 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372377 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5dbda14f-f860-4f24-ab29-43678602f4e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372395 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-cnibin\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372425 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-slash\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372447 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-netns\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372497 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-config\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372521 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-socket-dir-parent\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372541 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-multus-certs\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372586 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/354ecab7-6a88-47ab-8645-233ac3a125a5-hosts-file\") pod \"node-resolver-46q4m\" (UID: \"354ecab7-6a88-47ab-8645-233ac3a125a5\") " pod="openshift-dns/node-resolver-46q4m" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372607 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-host\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372625 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-systemd-units\") pod \"ovnkube-node-xwrqb\" (UID: 
\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372669 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-k8s-cni-cncf-io\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372690 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwtlf\" (UniqueName: \"kubernetes.io/projected/2ad1e98a-cb66-436d-8e5e-301724f70769-kube-api-access-zwtlf\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372779 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-var-lib-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.372967 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-netd\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373078 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " 
pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-cni-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373087 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5dbda14f-f860-4f24-ab29-43678602f4e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373123 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-daemon-config\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373176 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-host\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373182 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/354ecab7-6a88-47ab-8645-233ac3a125a5-hosts-file\") pod \"node-resolver-46q4m\" (UID: \"354ecab7-6a88-47ab-8645-233ac3a125a5\") " pod="openshift-dns/node-resolver-46q4m" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373183 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373252 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373270 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-k8s-cni-cncf-io\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373317 4870 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373333 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373346 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373359 4870 
reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373370 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373381 4870 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373393 4870 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373405 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373417 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373430 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373444 4870 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373457 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373469 4870 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373480 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373490 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373503 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373514 4870 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373524 4870 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373536 4870 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373547 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373557 4870 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373556 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-var-lib-openvswitch\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373569 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373558 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5dbda14f-f860-4f24-ab29-43678602f4e3-os-release\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: 
I0312 00:09:47.373582 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373594 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373610 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373624 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373607 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-netd\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373641 4870 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373653 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373664 
4870 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373675 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373681 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-socket-dir-parent\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373687 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373211 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373739 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373744 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-daemon-config\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373740 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-slash\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373217 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-systemd-units\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373761 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-netns\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.373857 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-host-run-multus-certs\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374059 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374077 4870 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374089 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374102 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374115 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374126 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374156 4870 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374170 4870 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" 
DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374181 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374193 4870 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374204 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374216 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374228 4870 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374239 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374250 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc 
kubenswrapper[4870]: I0312 00:09:47.374262 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374273 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374284 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374295 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374307 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374319 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374333 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: 
I0312 00:09:47.374346 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374356 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374368 4870 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374380 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374394 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374404 4870 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374415 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374426 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374435 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374445 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374457 4870 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374467 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374477 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374486 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374496 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node 
\"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374505 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374515 4870 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374526 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374537 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374547 4870 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374826 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ad1e98a-cb66-436d-8e5e-301724f70769-multus-cni-dir\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.374557 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.375220 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.375232 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.377485 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-kubelet\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.384121 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5dbda14f-f860-4f24-ab29-43678602f4e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.385537 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-config\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.389068 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovn-node-metrics-cert\") pod \"ovnkube-node-xwrqb\" 
(UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.389493 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/217db5b4-2e71-4611-8091-53f047a1b1e5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.390853 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/988c0290-1e98-46c8-8253-a4718914b9ef-proxy-tls\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.392744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.392780 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.392793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.392809 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.392822 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.394120 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bn5d\" (UniqueName: \"kubernetes.io/projected/217db5b4-2e71-4611-8091-53f047a1b1e5-kube-api-access-7bn5d\") pod \"ovnkube-control-plane-749d76644c-wrxrq\" (UID: \"217db5b4-2e71-4611-8091-53f047a1b1e5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.394710 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/354ecab7-6a88-47ab-8645-233ac3a125a5-kube-api-access-5bblc\") pod \"node-resolver-46q4m\" (UID: \"354ecab7-6a88-47ab-8645-233ac3a125a5\") " pod="openshift-dns/node-resolver-46q4m" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.395465 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45w79\" (UniqueName: \"kubernetes.io/projected/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-kube-api-access-45w79\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.395677 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxx2f\" (UniqueName: \"kubernetes.io/projected/0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111-kube-api-access-fxx2f\") pod \"node-ca-bnt7c\" (UID: \"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\") " pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.396677 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqh6s\" (UniqueName: \"kubernetes.io/projected/5dbda14f-f860-4f24-ab29-43678602f4e3-kube-api-access-gqh6s\") pod 
\"multus-additional-cni-plugins-7fbnk\" (UID: \"5dbda14f-f860-4f24-ab29-43678602f4e3\") " pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.397425 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs9fp\" (UniqueName: \"kubernetes.io/projected/988c0290-1e98-46c8-8253-a4718914b9ef-kube-api-access-xs9fp\") pod \"machine-config-daemon-84dfr\" (UID: \"988c0290-1e98-46c8-8253-a4718914b9ef\") " pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.397548 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwtlf\" (UniqueName: \"kubernetes.io/projected/2ad1e98a-cb66-436d-8e5e-301724f70769-kube-api-access-zwtlf\") pod \"multus-8hngl\" (UID: \"2ad1e98a-cb66-436d-8e5e-301724f70769\") " pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.399285 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr49h\" (UniqueName: \"kubernetes.io/projected/467385e2-3bbf-4cf0-909a-8e878b5d86dc-kube-api-access-hr49h\") pod \"ovnkube-node-xwrqb\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.429866 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.446806 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.452402 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"499ab7590905e705ff051263406037b8447c4cc7b3564d8712a95ea3436d0281"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.461496 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 00:09:47 crc kubenswrapper[4870]: W0312 00:09:47.472738 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-3df70d038b9a0f00b73991ed7cd7fbba94dec75ebb133851d9a95ca60f72ca39 WatchSource:0}: Error finding container 3df70d038b9a0f00b73991ed7cd7fbba94dec75ebb133851d9a95ca60f72ca39: Status 404 returned error can't find the container with id 3df70d038b9a0f00b73991ed7cd7fbba94dec75ebb133851d9a95ca60f72ca39 Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.478509 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" Mar 12 00:09:47 crc kubenswrapper[4870]: W0312 00:09:47.486779 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-aefa3e43e06a8a9a41cd19f7f7a83b62d58323db5eab283d5182ffc738a3465e WatchSource:0}: Error finding container aefa3e43e06a8a9a41cd19f7f7a83b62d58323db5eab283d5182ffc738a3465e: Status 404 returned error can't find the container with id aefa3e43e06a8a9a41cd19f7f7a83b62d58323db5eab283d5182ffc738a3465e Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.494551 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-46q4m" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.496289 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.496322 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.496332 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.496349 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.496362 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.509011 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bnt7c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.522997 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" Mar 12 00:09:47 crc kubenswrapper[4870]: W0312 00:09:47.535866 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354ecab7_6a88_47ab_8645_233ac3a125a5.slice/crio-b491625824b5220971811d9e9405265b2483ceaa60aa13bd528ed1856e651015 WatchSource:0}: Error finding container b491625824b5220971811d9e9405265b2483ceaa60aa13bd528ed1856e651015: Status 404 returned error can't find the container with id b491625824b5220971811d9e9405265b2483ceaa60aa13bd528ed1856e651015 Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.549631 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:47 crc kubenswrapper[4870]: W0312 00:09:47.584266 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467385e2_3bbf_4cf0_909a_8e878b5d86dc.slice/crio-eeb75b161617173fc75c006eb224489e05a0f4b000f626bfea176014320c34c1 WatchSource:0}: Error finding container eeb75b161617173fc75c006eb224489e05a0f4b000f626bfea176014320c34c1: Status 404 returned error can't find the container with id eeb75b161617173fc75c006eb224489e05a0f4b000f626bfea176014320c34c1 Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.591285 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8hngl" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.593589 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.600398 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.600443 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.600455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.600474 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.600498 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: W0312 00:09:47.657894 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod988c0290_1e98_46c8_8253_a4718914b9ef.slice/crio-8782c477ce2a126d588b94d809c59e0946a62a82b6269d1f7a125e3ca1767e8f WatchSource:0}: Error finding container 8782c477ce2a126d588b94d809c59e0946a62a82b6269d1f7a125e3ca1767e8f: Status 404 returned error can't find the container with id 8782c477ce2a126d588b94d809c59e0946a62a82b6269d1f7a125e3ca1767e8f Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.704788 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.704827 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.704864 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.704883 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.704895 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.778293 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.778387 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.778451 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:09:48.77842348 +0000 UTC m=+79.381839790 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.778453 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.778495 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:48.778488682 +0000 UTC m=+79.381904992 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.778569 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.779199 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.779285 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:48.779270214 +0000 UTC m=+79.382686524 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.806453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.806487 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.806497 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.806510 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.806519 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.879486 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.879527 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.879570 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879625 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879678 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:48.879655807 +0000 UTC m=+79.483072117 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879686 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879705 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879712 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879746 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879763 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879718 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879845 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:48.879830612 +0000 UTC m=+79.483246912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:47 crc kubenswrapper[4870]: E0312 00:09:47.879983 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:48.879954056 +0000 UTC m=+79.483370376 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.908930 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.908971 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.908980 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.908995 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:47 crc kubenswrapper[4870]: I0312 00:09:47.909008 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:47Z","lastTransitionTime":"2026-03-12T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.011487 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.011530 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.011541 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.011557 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.011568 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.114481 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.115469 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.115906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.115968 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.115987 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.116015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.116035 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.117047 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.118025 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.119467 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.120136 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.120949 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.122577 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.123964 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.125827 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.126398 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.127319 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.127869 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.128424 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.128930 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.129498 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.130277 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.131271 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.132612 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.135445 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.136779 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.139202 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.140345 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.142765 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.143732 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.145488 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.147640 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.148420 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.149973 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.150732 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.152201 4870 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.152362 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.154786 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.156317 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.156880 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.159019 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.159951 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.161240 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.162174 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.163836 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.164531 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.167789 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.168880 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.170658 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.171421 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.172717 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.173622 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.175395 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.177016 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.178425 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.180215 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.180909 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.181663 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.182680 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.218073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.218179 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.218198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.218221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.218238 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.321014 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.321090 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.321109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.321135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.321179 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.423515 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.423557 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.423572 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.423589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.423600 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.457733 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"aefa3e43e06a8a9a41cd19f7f7a83b62d58323db5eab283d5182ffc738a3465e"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.460125 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" event={"ID":"217db5b4-2e71-4611-8091-53f047a1b1e5","Type":"ContainerStarted","Data":"c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.460170 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" event={"ID":"217db5b4-2e71-4611-8091-53f047a1b1e5","Type":"ContainerStarted","Data":"48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.460180 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" event={"ID":"217db5b4-2e71-4611-8091-53f047a1b1e5","Type":"ContainerStarted","Data":"5f977c100d90e795a007f8b17d6ad463f6971d28d316a1a4af944686c6f5a1e4"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.464063 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.464096 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.464107 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3df70d038b9a0f00b73991ed7cd7fbba94dec75ebb133851d9a95ca60f72ca39"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.467603 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerStarted","Data":"1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.467631 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerStarted","Data":"7b4d3923c0d73b3d05cbc8ac76020eaa74cc3bb02d03bcbdc11bad1d57c39b1c"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.468977 4870 generic.go:334] "Generic (PLEG): container finished" podID="5dbda14f-f860-4f24-ab29-43678602f4e3" containerID="57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987" exitCode=0 Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.469053 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerDied","Data":"57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.469079 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" 
event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerStarted","Data":"d776850ea0b6b305c8cc8380ff9f48193e1269a2383b818100f6af56ae0fda90"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.471956 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.472003 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.472026 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"8782c477ce2a126d588b94d809c59e0946a62a82b6269d1f7a125e3ca1767e8f"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.474801 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5" exitCode=0 Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.474932 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.475006 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" 
event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"eeb75b161617173fc75c006eb224489e05a0f4b000f626bfea176014320c34c1"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.476650 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-46q4m" event={"ID":"354ecab7-6a88-47ab-8645-233ac3a125a5","Type":"ContainerStarted","Data":"475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.476715 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-46q4m" event={"ID":"354ecab7-6a88-47ab-8645-233ac3a125a5","Type":"ContainerStarted","Data":"b491625824b5220971811d9e9405265b2483ceaa60aa13bd528ed1856e651015"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.478946 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.479052 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.481503 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bnt7c" event={"ID":"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111","Type":"ContainerStarted","Data":"e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.481597 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bnt7c" 
event={"ID":"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111","Type":"ContainerStarted","Data":"72419cbde03467cc44fc03fff442bdb7fbe02121ab535850a15eb4278c7547bf"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.494476 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.515275 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.528107 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.528171 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.528184 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 
00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.528203 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.528217 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.533279 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.549501 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.564792 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.578686 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.594058 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.614764 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.627194 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.640299 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.640448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.641519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.641529 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.641546 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.641558 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.661407 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.670718 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.681666 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc 
kubenswrapper[4870]: I0312 00:09:48.695879 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc 
kubenswrapper[4870]: I0312 00:09:48.708509 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.719292 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.732432 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.744249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.744299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.744313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.744335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.744348 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.746227 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.768666 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.783266 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.790964 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.791133 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:09:50.791103619 +0000 UTC m=+81.394519929 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.791194 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.791265 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.791335 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.791400 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.791421 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:50.791400687 +0000 UTC m=+81.394816997 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.791439 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:50.791432448 +0000 UTC m=+81.394848758 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.796407 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.818547 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.831002 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.842602 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.849197 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.849717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.849732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.849752 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.849764 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.856903 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.875112 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.890240 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.892525 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.892565 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.892595 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892706 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892768 4870 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892826 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892839 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892706 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892865 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892790 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:50.892770488 +0000 UTC m=+81.496186798 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892881 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.892993 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:50.892946703 +0000 UTC m=+81.496363023 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:48 crc kubenswrapper[4870]: E0312 00:09:48.893065 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:50.893050576 +0000 UTC m=+81.496466946 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.952341 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.952390 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.952401 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.952416 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:48 crc kubenswrapper[4870]: I0312 00:09:48.952426 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:48Z","lastTransitionTime":"2026-03-12T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.054085 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.054136 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.054162 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.054181 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.054193 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.104216 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.104251 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.104306 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.104215 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:49 crc kubenswrapper[4870]: E0312 00:09:49.104337 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:09:49 crc kubenswrapper[4870]: E0312 00:09:49.104393 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:09:49 crc kubenswrapper[4870]: E0312 00:09:49.104464 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:09:49 crc kubenswrapper[4870]: E0312 00:09:49.104536 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.157188 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.157267 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.157293 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.157321 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.157338 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.260752 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.260809 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.260827 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.260858 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.260877 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.365849 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.365886 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.365896 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.365911 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.365922 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.469082 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.469120 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.469130 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.469158 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.469166 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.487905 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.487941 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.487950 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.487959 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.487968 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.487977 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} Mar 12 00:09:49 crc kubenswrapper[4870]: 
I0312 00:09:49.489513 4870 generic.go:334] "Generic (PLEG): container finished" podID="5dbda14f-f860-4f24-ab29-43678602f4e3" containerID="94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75" exitCode=0 Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.490007 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerDied","Data":"94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.503523 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc 
kubenswrapper[4870]: I0312 00:09:49.519711 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.531343 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.545218 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.554799 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.566809 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.571396 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.571438 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.571453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.571467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.571477 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.583837 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.594415 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.603724 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.615139 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.632351 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.649685 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.668985 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.674186 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.674246 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.674258 4870 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.674291 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.674302 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.684136 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:49Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.776564 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.776600 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.776610 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.776626 4870 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.776640 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.879757 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.879986 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.879999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.880181 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.880385 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.984582 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.984645 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.984663 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.984685 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:49 crc kubenswrapper[4870]: I0312 00:09:49.984702 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:49Z","lastTransitionTime":"2026-03-12T00:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.088345 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.088393 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.088404 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.088421 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.088432 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.125172 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.136873 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.136940 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.137211 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.145879 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.162114 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.180571 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00
:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.190519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.190613 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.190630 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.190650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.190664 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.206396 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.223756 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.237075 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.247804 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.265694 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.280912 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.293350 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.293387 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.293397 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.293411 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.293420 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.297671 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.312118 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.330464 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.346745 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.367474 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.368365 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.368496 4870 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.368619 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.368733 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.386942 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.392427 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.392562 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.392662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.392765 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.392860 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.413441 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.418098 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.418319 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.418416 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.418507 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.418596 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.436400 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.440903 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.440984 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.441001 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.441019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.441032 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.461880 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.466423 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.466476 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.466493 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.466515 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.466531 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.485483 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.485635 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.488134 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.488206 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.488224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.488248 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.488263 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.494811 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.500082 4870 generic.go:334] "Generic (PLEG): container finished" podID="5dbda14f-f860-4f24-ab29-43678602f4e3" containerID="53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3" exitCode=0 Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.500211 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerDied","Data":"53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.501029 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0" Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.501367 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.513369 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.537409 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.563881 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.585617 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.591239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.591300 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.591318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.591344 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.591361 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.603448 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.620053 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.651221 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.666634 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.681508 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.694383 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.694423 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.694437 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.694458 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.694474 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.694518 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.708818 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.720032 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.733493 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.743038 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.752021 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc 
kubenswrapper[4870]: I0312 00:09:50.770870 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.782541 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.795257 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.797242 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.797299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.797318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.797341 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.797358 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.809247 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.817215 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.817379 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:09:54.817344779 +0000 UTC m=+85.420761139 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.817417 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.817480 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.817570 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.817632 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.817640 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:54.817602357 +0000 UTC m=+85.421018657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.817731 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:54.81771282 +0000 UTC m=+85.421129120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.822245 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.835479 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.857835 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 
00:09:50.869590 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.881755 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc 
kubenswrapper[4870]: I0312 00:09:50.898693 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc 
kubenswrapper[4870]: I0312 00:09:50.899671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.899721 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.899740 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.899764 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.899781 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:50Z","lastTransitionTime":"2026-03-12T00:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.915378 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.918439 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.918493 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.918555 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.918729 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.918762 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.918781 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.918839 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:54.918818713 +0000 UTC m=+85.522235063 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.919260 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.919316 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. 
No retries permitted until 2026-03-12 00:09:54.919300847 +0000 UTC m=+85.522717187 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.919399 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.919422 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.919438 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:50 crc kubenswrapper[4870]: E0312 00:09:50.919476 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:09:54.919463942 +0000 UTC m=+85.522880292 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.931237 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.943622 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b1
7b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is 
after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.958120 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:50 crc kubenswrapper[4870]: I0312 00:09:50.970516 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.002701 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.002756 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.002773 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.002797 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.002814 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.103832 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.103864 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.103873 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.103919 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:51 crc kubenswrapper[4870]: E0312 00:09:51.103962 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:09:51 crc kubenswrapper[4870]: E0312 00:09:51.104165 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:09:51 crc kubenswrapper[4870]: E0312 00:09:51.104416 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:09:51 crc kubenswrapper[4870]: E0312 00:09:51.104499 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.106065 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.106095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.106103 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.106116 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.106126 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.209336 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.209401 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.209419 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.209445 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.209464 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.312274 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.312304 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.312313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.312326 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.312334 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.417211 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.417248 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.417260 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.417274 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.417285 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.508128 4870 generic.go:334] "Generic (PLEG): container finished" podID="5dbda14f-f860-4f24-ab29-43678602f4e3" containerID="f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1" exitCode=0 Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.508593 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerDied","Data":"f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.520341 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.520392 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.520409 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.520430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.520447 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.524408 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.548233 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e6698
0d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.560202 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc 
kubenswrapper[4870]: I0312 00:09:51.584289 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.605553 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.622652 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.623355 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.623392 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.623407 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.623429 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.623444 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.637508 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.656871 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d
157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.675761 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.688430 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.701615 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.717370 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.729023 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.729079 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.729092 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.729114 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.729132 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.732208 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.752190 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.764245 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.831255 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.831290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.831301 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.831318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.831330 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.934155 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.934205 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.934216 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.934254 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:51 crc kubenswrapper[4870]: I0312 00:09:51.934266 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:51Z","lastTransitionTime":"2026-03-12T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.036813 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.036863 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.036880 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.036902 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.036919 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.112543 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.139236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.139302 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.139326 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.139357 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.139381 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.242573 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.242639 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.242663 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.242693 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.242716 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.344607 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.344657 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.344671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.344691 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.344705 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.447274 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.447346 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.447365 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.447388 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.447406 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.523662 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.527979 4870 generic.go:334] "Generic (PLEG): container finished" podID="5dbda14f-f860-4f24-ab29-43678602f4e3" containerID="fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2" exitCode=0 Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.528171 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerDied","Data":"fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.555997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.556101 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.556128 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.555853 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.556243 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.556315 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.580303 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.602724 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.622564 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.640278 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.658846 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.658918 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.658936 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.658960 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.658979 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.661046 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z 
is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.678683 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.694482 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.714370 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.733597 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.752266 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.761583 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.761649 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.761668 4870 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.761693 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.761710 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.781671 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.796519 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.813908 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.825485 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47
5df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.836454 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:52 crc 
kubenswrapper[4870]: I0312 00:09:52.863836 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.863873 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.863881 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.863894 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.863903 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.967372 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.967445 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.967470 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.967499 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:52 crc kubenswrapper[4870]: I0312 00:09:52.967520 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:52Z","lastTransitionTime":"2026-03-12T00:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.070593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.071002 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.071099 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.071226 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.071316 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.103902 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:53 crc kubenswrapper[4870]: E0312 00:09:53.104397 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.103998 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:53 crc kubenswrapper[4870]: E0312 00:09:53.104591 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.103928 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:53 crc kubenswrapper[4870]: E0312 00:09:53.104755 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.104024 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:53 crc kubenswrapper[4870]: E0312 00:09:53.104961 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.177648 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.177717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.177739 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.177763 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.177780 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.280985 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.281514 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.281669 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.281822 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.281948 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.386928 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.387440 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.387590 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.387734 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.387864 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.491414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.491464 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.491476 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.491492 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.491509 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.537610 4870 generic.go:334] "Generic (PLEG): container finished" podID="5dbda14f-f860-4f24-ab29-43678602f4e3" containerID="65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14" exitCode=0 Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.537679 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerDied","Data":"65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.562384 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.586410 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.594904 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.594948 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.594976 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.594995 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.595008 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.609112 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.631521 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.653961 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.671016 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.700486 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.703913 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.703958 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.703972 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.703991 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.704005 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.745472 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.770216 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.789902 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.802751 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.807861 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.807899 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.807908 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc 
kubenswrapper[4870]: I0312 00:09:53.807924 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.807935 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.817169 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.829361 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.842525 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.851253 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.860675 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:53Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:53 crc 
kubenswrapper[4870]: I0312 00:09:53.911492 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.911540 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.911550 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.911568 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:53 crc kubenswrapper[4870]: I0312 00:09:53.911580 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:53Z","lastTransitionTime":"2026-03-12T00:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.015008 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.015067 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.015093 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.015116 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.015132 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.117857 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.117917 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.117936 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.117961 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.117978 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.220825 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.220905 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.220937 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.220968 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.220989 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.324488 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.324844 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.324979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.325190 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.325346 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.429079 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.429908 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.430057 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.430224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.430393 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.534133 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.534210 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.534227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.534253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.534272 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.549349 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.549969 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.550018 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.550039 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.561395 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" event={"ID":"5dbda14f-f860-4f24-ab29-43678602f4e3","Type":"ContainerStarted","Data":"b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.576919 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11e
f0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.589239 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.589325 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.596910 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.619794 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.637237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.637442 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.637571 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.637707 4870 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.637784 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.637994 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.653649 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.676349 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.692204 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc 
kubenswrapper[4870]: I0312 00:09:54.712899 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.732437 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.740494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.740570 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.740588 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc 
kubenswrapper[4870]: I0312 00:09:54.740614 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.740633 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.752093 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.770754 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.802572 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.821465 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.843278 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.843342 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.843361 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.843385 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.843404 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.847204 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.859952 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.864726 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.864882 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.864930 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.865023 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:10:02.864977364 +0000 UTC m=+93.468393694 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.865033 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.865033 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.869312 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:02.869112992 +0000 UTC m=+93.472529322 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.869392 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:02.86936603 +0000 UTC m=+93.472782360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.878674 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc 
kubenswrapper[4870]: I0312 00:09:54.899874 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300
ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.913310 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72
bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.928083 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc 
kubenswrapper[4870]: I0312 00:09:54.938883 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
xx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.946842 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.946883 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.946900 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.946922 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.946939 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:54Z","lastTransitionTime":"2026-03-12T00:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.960703 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z 
is after 2025-08-24T17:21:41Z" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.966336 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.966412 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.966472 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966562 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966681 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966726 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:54 crc 
kubenswrapper[4870]: E0312 00:09:54.966747 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966686 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:02.966657664 +0000 UTC m=+93.570073994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966684 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966839 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:02.966813498 +0000 UTC m=+93.570229848 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966848 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966868 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:54 crc kubenswrapper[4870]: E0312 00:09:54.966921 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:02.966902701 +0000 UTC m=+93.570319021 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:09:54 crc kubenswrapper[4870]: I0312 00:09:54.983479 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.000400 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:54Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.015866 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.028522 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.041028 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.050969 4870 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.051058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.051076 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.051100 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.051117 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.067077 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.081918 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.095867 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.104386 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:55 crc kubenswrapper[4870]: E0312 00:09:55.104559 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.104802 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.104944 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.105190 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:55 crc kubenswrapper[4870]: E0312 00:09:55.105175 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:09:55 crc kubenswrapper[4870]: E0312 00:09:55.105353 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:09:55 crc kubenswrapper[4870]: E0312 00:09:55.105513 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.113581 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.128294 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.146680 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:55Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.153508 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.153548 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.153557 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.153570 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.153581 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.256212 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.256287 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.256308 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.256335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.256357 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.360033 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.360094 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.360112 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.360135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.360187 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.463630 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.463687 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.463704 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.463728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.463745 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.565763 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.565813 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.565825 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.565840 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.565852 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.668606 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.668651 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.668661 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.668679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.668696 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.770615 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.770667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.770683 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.770704 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.770723 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.872944 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.872984 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.872992 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.873007 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.873016 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.975139 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.975192 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.975200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.975214 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:55 crc kubenswrapper[4870]: I0312 00:09:55.975223 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:55Z","lastTransitionTime":"2026-03-12T00:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.077646 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.077679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.077688 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.077701 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.077709 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.114922 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.180200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.180231 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.180239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.180252 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.180269 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.283166 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.283216 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.283227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.283244 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.283256 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.386172 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.386212 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.386230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.386253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.386277 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.489095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.489198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.489215 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.489237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.489254 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.593867 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.593904 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.593915 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.593931 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.593943 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.695604 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.695645 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.695654 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.695667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.695677 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.799814 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.799914 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.799939 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.799969 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.799992 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.902871 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.902927 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.902939 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.902957 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:56 crc kubenswrapper[4870]: I0312 00:09:56.902969 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:56Z","lastTransitionTime":"2026-03-12T00:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.006579 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.006648 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.006662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.006684 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.006702 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.104181 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.104186 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.104280 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.104365 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:57 crc kubenswrapper[4870]: E0312 00:09:57.104578 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:09:57 crc kubenswrapper[4870]: E0312 00:09:57.104768 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:09:57 crc kubenswrapper[4870]: E0312 00:09:57.104904 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:09:57 crc kubenswrapper[4870]: E0312 00:09:57.105114 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.110697 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.110748 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.110766 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.110789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.110807 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.213762 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.213815 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.213830 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.213847 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.213862 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.317417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.317467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.317486 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.317509 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.317716 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.420903 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.420950 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.420966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.420988 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.421009 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.524245 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.524307 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.524323 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.524342 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.524354 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.573948 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/0.log" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.578061 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045" exitCode=1 Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.578121 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.579032 4870 scope.go:117] "RemoveContainer" containerID="7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.606569 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11e
f0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.627987 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.629474 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.629494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.629504 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.629519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.629530 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.645065 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.661171 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.672607 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.684842 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.695037 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.706928 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.720303 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.732027 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.732062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.732075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.732093 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.732106 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.734107 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.746210 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.770075 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:56Z\\\",\\\"message\\\":\\\"ller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:09:56.466538 6498 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:09:56.466626 6498 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:56.468197 6498 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0312 00:09:56.468244 6498 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:09:56.468268 6498 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:56.468277 6498 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:56.468326 6498 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:56.468403 6498 factory.go:656] Stopping watch factory\\\\nI0312 00:09:56.468435 6498 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:56.466728 6498 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 00:09:56.468482 6498 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:09:56.468499 6498 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:56.468513 6498 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:56.468528 6498 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:56.468543 6498 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.783507 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.799490 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.813884 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.834789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.834833 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.834849 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.834871 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.834891 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.840215 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.851685 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:57Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.937375 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.937436 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.937450 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.937472 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:57 crc kubenswrapper[4870]: I0312 00:09:57.937487 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:57Z","lastTransitionTime":"2026-03-12T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.039793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.039860 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.039874 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.039895 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.039912 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.142631 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.142679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.142693 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.142709 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.142723 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.245063 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.245126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.245161 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.245184 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.245202 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.347771 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.347808 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.347818 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.347831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.347840 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.450981 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.451025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.451037 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.451054 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.451066 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.553666 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.553698 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.553707 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.553719 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.553727 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.584404 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/0.log" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.587006 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.588236 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.602218 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f
7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.613479 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.628979 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.650415 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.656374 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.656445 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.656466 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.656492 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.656510 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.665585 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.678377 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.692049 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.715569 4870 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:56Z\\\",\\\"message\\\":\\\"ller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:09:56.466538 6498 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:09:56.466626 6498 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:56.468197 6498 handler.go:190] Sending *v1.Pod event handler 6 
for removal\\\\nI0312 00:09:56.468244 6498 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:09:56.468268 6498 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:56.468277 6498 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:56.468326 6498 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:56.468403 6498 factory.go:656] Stopping watch factory\\\\nI0312 00:09:56.468435 6498 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:56.466728 6498 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 00:09:56.468482 6498 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:09:56.468499 6498 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:56.468513 6498 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:56.468528 6498 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:56.468543 6498 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.727859 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.745549 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.757443 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.758603 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.758638 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc 
kubenswrapper[4870]: I0312 00:09:58.758650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.758665 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.758676 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.768044 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.779224 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.792340 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.802366 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.820031 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.829629 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:58 crc 
kubenswrapper[4870]: I0312 00:09:58.860783 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.860826 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.860835 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.860850 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.860860 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.962859 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.962906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.962917 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.962932 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:58 crc kubenswrapper[4870]: I0312 00:09:58.962945 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:58Z","lastTransitionTime":"2026-03-12T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.065879 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.065916 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.065928 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.065944 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.065955 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.104502 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.104555 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.104645 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:09:59 crc kubenswrapper[4870]: E0312 00:09:59.104786 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.105073 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:09:59 crc kubenswrapper[4870]: E0312 00:09:59.105228 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:09:59 crc kubenswrapper[4870]: E0312 00:09:59.105279 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:09:59 crc kubenswrapper[4870]: E0312 00:09:59.105375 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.167866 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.167904 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.167917 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.167934 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.167947 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.271248 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.271297 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.271310 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.271327 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.271341 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.374997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.375057 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.375074 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.375096 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.375109 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.478468 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.478570 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.478588 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.478618 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.478642 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.581830 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.581889 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.581911 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.581933 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.581951 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.593562 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/1.log" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.594934 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/0.log" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.599553 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7" exitCode=1 Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.599634 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.599812 4870 scope.go:117] "RemoveContainer" containerID="7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.600509 4870 scope.go:117] "RemoveContainer" containerID="4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7" Mar 12 00:09:59 crc kubenswrapper[4870]: E0312 00:09:59.600746 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.627461 4870 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f
687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.639624 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.648752 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc 
kubenswrapper[4870]: I0312 00:09:59.666858 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.679320 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.685272 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.685319 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.685335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.685352 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.685363 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.690239 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.707360 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.725759 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.745116 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.758369 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.777410 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.789266 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.789329 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.789350 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.789375 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.789391 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.792031 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.816711 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:56Z\\\",\\\"message\\\":\\\"ller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:09:56.466538 6498 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:09:56.466626 6498 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:56.468197 6498 handler.go:190] Sending *v1.Pod event handler 6 
for removal\\\\nI0312 00:09:56.468244 6498 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:09:56.468268 6498 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:56.468277 6498 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:56.468326 6498 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:56.468403 6498 factory.go:656] Stopping watch factory\\\\nI0312 00:09:56.468435 6498 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:56.466728 6498 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 00:09:56.468482 6498 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:09:56.468499 6498 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:56.468513 6498 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:56.468528 6498 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:56.468543 6498 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:58Z\\\",\\\"message\\\":\\\"removal\\\\nI0312 00:09:58.808413 6846 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:58.808499 6846 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0312 00:09:58.808466 6846 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:58.808538 6846 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:09:58.808563 6846 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:58.816427 6846 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:58.816443 
6846 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:58.816460 6846 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:58.816482 6846 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:09:58.816486 6846 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0312 00:09:58.816499 6846 factory.go:656] Stopping watch factory\\\\nI0312 00:09:58.816511 6846 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:58.816535 6846 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:09:58.816601 6846 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:58.816549 6846 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0312 00:09:58.816556 6846 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var
/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.831892 4870 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.845377 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.866680 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.884954 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:09:59Z is after 2025-08-24T17:21:41Z" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.892183 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.892219 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.892229 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.892244 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.892255 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.994706 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.994753 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.994764 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.994784 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:09:59 crc kubenswrapper[4870]: I0312 00:09:59.994795 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:09:59Z","lastTransitionTime":"2026-03-12T00:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.097885 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.097949 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.097967 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.097994 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.098014 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.125869 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.142569 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.158974 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.172334 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.185071 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.200566 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.201368 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.201409 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.201421 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.201439 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc 
kubenswrapper[4870]: I0312 00:10:00.201452 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.213449 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.229438 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.243456 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.256979 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.278381 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7621c9113864acb7e4e10e1171e1d05bf9a8c581f790e1751128d8d0ce19c045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:56Z\\\",\\\"message\\\":\\\"ller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:09:56.466538 6498 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:09:56.466626 6498 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:56.468197 6498 handler.go:190] Sending *v1.Pod event handler 6 
for removal\\\\nI0312 00:09:56.468244 6498 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:09:56.468268 6498 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:56.468277 6498 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:56.468326 6498 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:56.468403 6498 factory.go:656] Stopping watch factory\\\\nI0312 00:09:56.468435 6498 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:56.466728 6498 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 00:09:56.468482 6498 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:09:56.468499 6498 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:56.468513 6498 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:56.468528 6498 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:56.468543 6498 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:58Z\\\",\\\"message\\\":\\\"removal\\\\nI0312 00:09:58.808413 6846 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:58.808499 6846 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0312 00:09:58.808466 6846 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:58.808538 6846 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:09:58.808563 6846 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:58.816427 6846 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:58.816443 
6846 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:58.816460 6846 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:58.816482 6846 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:09:58.816486 6846 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0312 00:09:58.816499 6846 factory.go:656] Stopping watch factory\\\\nI0312 00:09:58.816511 6846 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:58.816535 6846 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:09:58.816601 6846 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:58.816549 6846 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0312 00:09:58.816556 6846 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var
/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.292380 4870 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.304508 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.308099 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.308166 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.308179 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.308196 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.308209 4870 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.314464 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-
node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.339707 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.349857 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc 
kubenswrapper[4870]: I0312 00:10:00.378978 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.411330 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.411356 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.411365 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.411378 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.411387 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.514515 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.514569 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.514586 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.514612 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.514630 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.604471 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/1.log" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.608398 4870 scope.go:117] "RemoveContainer" containerID="4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7" Mar 12 00:10:00 crc kubenswrapper[4870]: E0312 00:10:00.608539 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.618098 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.618124 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.618135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.618170 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.618183 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.619783 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.630805 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.648106 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:58Z\\\",\\\"message\\\":\\\"removal\\\\nI0312 00:09:58.808413 6846 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:58.808499 6846 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0312 00:09:58.808466 6846 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:58.808538 6846 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI0312 00:09:58.808563 6846 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:58.816427 6846 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:58.816443 6846 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:58.816460 6846 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:58.816482 6846 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:09:58.816486 6846 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0312 00:09:58.816499 6846 factory.go:656] Stopping watch factory\\\\nI0312 00:09:58.816511 6846 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:58.816535 6846 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:09:58.816601 6846 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:58.816549 6846 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0312 00:09:58.816556 6846 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.659289 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.669219 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.685460 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.696925 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.725003 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.725088 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.725130 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.725195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.725222 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.726202 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.739696 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b198
88cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.739912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.739935 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.739943 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.739955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: 
I0312 00:10:00.739965 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.749445 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc 
kubenswrapper[4870]: E0312 00:10:00.752707 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.756251 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.756290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.756299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.756313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.756324 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: E0312 00:10:00.767335 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.769799 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.770600 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.770637 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.770646 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.770659 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.770668 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.782705 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: E0312 00:10:00.799298 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.803170 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.803198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.803207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.803221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.803230 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.830013 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: E0312 00:10:00.838314 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.841656 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.841707 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.841718 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.841733 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.841746 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.857355 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z 
is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: E0312 00:10:00.874727 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: E0312 00:10:00.874987 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.876593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.876646 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.876661 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.876678 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.876689 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.886587 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.899706 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.910693 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.978669 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.978742 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.978765 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.978790 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:00 crc kubenswrapper[4870]: I0312 00:10:00.978811 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:00Z","lastTransitionTime":"2026-03-12T00:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.081750 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.081801 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.081820 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.081845 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.081865 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.104790 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:01 crc kubenswrapper[4870]: E0312 00:10:01.104900 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.104931 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.104943 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:01 crc kubenswrapper[4870]: E0312 00:10:01.105050 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.105125 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:01 crc kubenswrapper[4870]: E0312 00:10:01.105272 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:01 crc kubenswrapper[4870]: E0312 00:10:01.105374 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.184697 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.184732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.184741 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.184754 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.184763 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.288044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.288135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.288209 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.288241 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.288262 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.391439 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.391514 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.391536 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.391568 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.391739 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.493981 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.494036 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.494053 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.494077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.494093 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.597430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.597472 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.597484 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.597522 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.597537 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.700030 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.700104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.700127 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.700193 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.700217 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.803270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.803314 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.803323 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.803336 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.803347 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.906488 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.906536 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.906545 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.906560 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:01 crc kubenswrapper[4870]: I0312 00:10:01.906572 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:01Z","lastTransitionTime":"2026-03-12T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.009077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.009204 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.009227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.009253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.009271 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.112440 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.112488 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.112498 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.112512 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.112522 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.216950 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.217007 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.217024 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.217046 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.217063 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.324526 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.324618 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.324647 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.324680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.324715 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.427533 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.427828 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.427942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.428017 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.428086 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.530884 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.530960 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.530983 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.531011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.531041 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.634938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.635000 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.635022 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.635052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.635074 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.737172 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.737227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.737246 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.737268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.737285 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.838998 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.839036 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.839075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.839091 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.839100 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.941652 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.941696 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.941706 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.941719 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.941728 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:02Z","lastTransitionTime":"2026-03-12T00:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.957597 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.957996 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:02 crc kubenswrapper[4870]: E0312 00:10:02.958106 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:10:18.958071151 +0000 UTC m=+109.561487491 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:10:02 crc kubenswrapper[4870]: E0312 00:10:02.958179 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:10:02 crc kubenswrapper[4870]: E0312 00:10:02.958249 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:18.958230765 +0000 UTC m=+109.561647075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:10:02 crc kubenswrapper[4870]: I0312 00:10:02.958339 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:02 crc kubenswrapper[4870]: E0312 00:10:02.958391 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:10:02 crc kubenswrapper[4870]: E0312 00:10:02.958413 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:18.95840675 +0000 UTC m=+109.561823060 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.045278 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.045373 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.045394 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.045429 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.045465 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.059898 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.059963 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.060024 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060215 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060275 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060308 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:10:03 crc 
kubenswrapper[4870]: E0312 00:10:03.060327 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060323 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060377 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060399 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060332 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:19.060295426 +0000 UTC m=+109.663711776 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060502 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:19.060475541 +0000 UTC m=+109.663891881 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.060526 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:19.060513892 +0000 UTC m=+109.663930242 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.104880 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.105465 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.105508 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.105589 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.105733 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.105947 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0" Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.106016 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.106248 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.106290 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 00:10:03 crc kubenswrapper[4870]: E0312 00:10:03.106350 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.147693 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.147743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.147756 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.147774 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.147788 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.251050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.251095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.251105 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.251120 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.251128 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.354794 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.354840 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.354855 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.354869 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.354880 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.457821 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.457855 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.457868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.457884 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.457895 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.560872 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.560948 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.560970 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.560998 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.561017 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.664034 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.664118 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.664182 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.664215 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.664239 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.766759 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.766819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.766835 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.766878 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.766896 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.869609 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.869660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.869678 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.869701 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.869717 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.972237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.972298 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.972310 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.972327 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:03 crc kubenswrapper[4870]: I0312 00:10:03.972363 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:03Z","lastTransitionTime":"2026-03-12T00:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.074737 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.074790 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.074812 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.074841 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.074861 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.177992 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.178034 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.178051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.178072 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.178083 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.220054 4870 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.281057 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.281182 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.281213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.281243 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.281266 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.384227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.384261 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.384272 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.384285 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.384295 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.487206 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.487250 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.487265 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.487288 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.487769 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.591682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.591758 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.591783 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.591811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.591828 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.694613 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.694662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.694679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.694700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.694717 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.798022 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.798077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.798099 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.798126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.798191 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.901313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.901411 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.901435 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.901462 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:04 crc kubenswrapper[4870]: I0312 00:10:04.901482 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:04Z","lastTransitionTime":"2026-03-12T00:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.004446 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.004508 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.004534 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.004562 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.004599 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.103885 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.103892 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:05 crc kubenswrapper[4870]: E0312 00:10:05.104104 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.103918 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:05 crc kubenswrapper[4870]: E0312 00:10:05.104296 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.103922 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:05 crc kubenswrapper[4870]: E0312 00:10:05.104475 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:05 crc kubenswrapper[4870]: E0312 00:10:05.104802 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.110259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.110307 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.110318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.110411 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.110427 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.213681 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.213733 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.213742 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.213756 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.213768 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.319449 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.319494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.319503 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.319517 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.319526 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.421642 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.421697 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.421710 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.421729 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.421746 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.525045 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.525113 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.525136 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.525213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.525239 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.627254 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.627288 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.627298 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.627315 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.627325 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.730291 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.730339 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.730353 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.730370 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.730383 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.832433 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.832488 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.832500 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.832516 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.832530 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.936379 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.936434 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.936448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.936469 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:05 crc kubenswrapper[4870]: I0312 00:10:05.936487 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:05Z","lastTransitionTime":"2026-03-12T00:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.039499 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.039576 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.039594 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.039621 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.039639 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.143042 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.143093 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.143108 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.143128 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.143272 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.246474 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.246559 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.246579 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.246607 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.246626 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.276848 4870 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.349126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.349190 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.349200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.349215 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.349227 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.452351 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.452429 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.452446 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.452468 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.452486 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.554832 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.554881 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.554896 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.554912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.554972 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.657985 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.658040 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.658062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.658087 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.658104 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.760394 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.760440 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.760449 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.760467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.760477 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.863647 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.863689 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.863700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.863721 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.863733 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.966685 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.966729 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.966737 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.966752 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:06 crc kubenswrapper[4870]: I0312 00:10:06.966765 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:06Z","lastTransitionTime":"2026-03-12T00:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.071227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.071297 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.071315 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.071340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.071357 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.104213 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.104307 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.104265 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.104265 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:07 crc kubenswrapper[4870]: E0312 00:10:07.104460 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:07 crc kubenswrapper[4870]: E0312 00:10:07.104618 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:07 crc kubenswrapper[4870]: E0312 00:10:07.104741 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:07 crc kubenswrapper[4870]: E0312 00:10:07.104833 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.174491 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.174589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.174603 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.174625 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.174646 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.280549 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.280625 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.280648 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.280681 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.280704 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.383728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.383803 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.383830 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.383860 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.383884 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.487714 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.487762 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.487774 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.487789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.487800 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.590933 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.590976 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.590997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.591014 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.591025 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.694929 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.695001 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.695018 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.695045 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.695063 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.798268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.798335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.798358 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.798384 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.798406 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.901089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.901202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.901214 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.901227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:07 crc kubenswrapper[4870]: I0312 00:10:07.901237 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:07Z","lastTransitionTime":"2026-03-12T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.005104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.005191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.005204 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.005222 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.005238 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.108285 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.108346 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.108363 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.108415 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.108432 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.211330 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.211400 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.211417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.211440 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.211458 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.314370 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.314418 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.314431 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.314448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.314461 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.418073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.418176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.418196 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.418218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.418233 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.522109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.522232 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.522250 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.522280 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.522298 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.625593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.625668 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.625688 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.625713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.625731 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.729472 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.729578 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.729597 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.729625 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.729646 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.832857 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.832930 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.832947 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.832971 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.832988 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.936474 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.936528 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.936545 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.936566 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:08 crc kubenswrapper[4870]: I0312 00:10:08.936584 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:08Z","lastTransitionTime":"2026-03-12T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.040774 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.040846 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.040868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.040896 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.040919 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.104757 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.104812 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.104772 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 12 00:10:09 crc kubenswrapper[4870]: E0312 00:10:09.104955 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.104967 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6"
Mar 12 00:10:09 crc kubenswrapper[4870]: E0312 00:10:09.105048 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 12 00:10:09 crc kubenswrapper[4870]: E0312 00:10:09.105136 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 12 00:10:09 crc kubenswrapper[4870]: E0312 00:10:09.105314 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.144717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.144844 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.144863 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.144887 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.144937 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.247959 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.248029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.248050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.248086 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.248109 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.354926 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.355078 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.355095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.355121 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.355189 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.458257 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.458323 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.458340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.458366 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.458385 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.561816 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.561928 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.561951 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.561979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.562002 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.665417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.665488 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.665509 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.665540 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.665562 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.768766 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.769243 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.769434 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.769585 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.769713 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.873223 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.873290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.873307 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.873333 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.873350 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.975995 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.976075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.976097 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.976129 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:09 crc kubenswrapper[4870]: I0312 00:10:09.976179 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:09Z","lastTransitionTime":"2026-03-12T00:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.079241 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.079296 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.079315 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.079337 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.079353 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.131515 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z"
Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.165856 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.182551 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.182624 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.182661 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.182686 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.182704 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.184875 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.196985 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.216296 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.239535 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.260114 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.278999 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.285089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.285200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.286337 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.286383 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.286395 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.296213 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.311790 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.332522 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:58Z\\\",\\\"message\\\":\\\"removal\\\\nI0312 00:09:58.808413 6846 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:58.808499 6846 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0312 00:09:58.808466 6846 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:58.808538 6846 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI0312 00:09:58.808563 6846 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:58.816427 6846 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:58.816443 6846 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:58.816460 6846 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:58.816482 6846 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:09:58.816486 6846 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0312 00:09:58.816499 6846 factory.go:656] Stopping watch factory\\\\nI0312 00:09:58.816511 6846 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:58.816535 6846 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:09:58.816601 6846 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:58.816549 6846 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0312 00:09:58.816556 6846 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.347381 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.361682 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.378595 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.389312 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.389381 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc 
kubenswrapper[4870]: I0312 00:10:10.389403 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.389429 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.389445 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.397718 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.421999 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.436962 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:10Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.492969 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.493045 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.493068 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.493101 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.493127 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.596225 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.596599 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.596891 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.597092 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.597307 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.699680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.699716 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.699726 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.699741 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.699755 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.802052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.802137 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.802199 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.802224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.802244 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.905317 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.905642 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.905847 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.905992 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:10 crc kubenswrapper[4870]: I0312 00:10:10.906121 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:10Z","lastTransitionTime":"2026-03-12T00:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.009829 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.009901 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.009921 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.009951 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.009971 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.104137 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.104229 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.104278 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.104371 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.104182 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.104516 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.104647 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.104763 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.119661 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.119814 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.119839 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.119867 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.119889 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.145234 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.145346 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.145372 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.145401 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.145421 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.165829 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:11Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.170444 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.170598 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.170621 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.170647 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.170668 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.190363 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:11Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.195179 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.195226 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.195238 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.195254 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.195266 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.213316 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:11Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.218058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.218176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.218198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.218222 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.218242 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.238010 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:11Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.243029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.243102 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.243119 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.243178 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.243202 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.257298 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:11Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:11 crc kubenswrapper[4870]: E0312 00:10:11.257442 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.259682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.259727 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.259744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.260091 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.260126 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.362761 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.362818 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.362835 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.362856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.362871 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.466026 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.466171 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.466197 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.466225 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.466244 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.569982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.570062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.570092 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.570123 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.570140 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.673810 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.673882 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.673902 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.673927 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.673944 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.776851 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.776929 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.776955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.776986 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.777010 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.880321 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.880393 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.880414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.880446 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.880463 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.984310 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.984367 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.984384 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.984407 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:11 crc kubenswrapper[4870]: I0312 00:10:11.984424 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:11Z","lastTransitionTime":"2026-03-12T00:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.087125 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.087317 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.087336 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.087360 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.087378 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.190083 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.190643 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.190789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.190971 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.191108 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.295052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.295113 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.295133 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.295199 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.295221 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.398660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.398728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.398747 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.398770 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.398788 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.502763 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.502817 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.502844 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.502873 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.502895 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.606518 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.607013 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.607188 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.607326 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.607448 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.711181 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.711518 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.711724 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.711897 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.712068 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.815439 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.815502 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.815525 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.815551 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.815571 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.918672 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.918735 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.918755 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.918784 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:12 crc kubenswrapper[4870]: I0312 00:10:12.918804 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:12Z","lastTransitionTime":"2026-03-12T00:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.022467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.022548 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.022572 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.022601 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.022623 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.104727 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.104726 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.104822 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.104891 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:13 crc kubenswrapper[4870]: E0312 00:10:13.105049 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:13 crc kubenswrapper[4870]: E0312 00:10:13.105269 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:13 crc kubenswrapper[4870]: E0312 00:10:13.105410 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:13 crc kubenswrapper[4870]: E0312 00:10:13.105618 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.126253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.126304 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.126326 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.126355 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.126376 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.229667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.230059 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.230138 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.230267 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.230354 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.333414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.333869 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.334025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.334213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.334402 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.438344 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.438810 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.438980 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.439268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.439457 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.543120 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.543766 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.543966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.544223 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.544406 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.647534 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.647612 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.647630 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.647655 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.647676 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.750618 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.750657 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.750670 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.750687 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.750699 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.854051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.854095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.854107 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.854125 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.854168 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.959653 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.959703 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.959715 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.959734 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:13 crc kubenswrapper[4870]: I0312 00:10:13.959747 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:13Z","lastTransitionTime":"2026-03-12T00:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.063206 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.063254 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.063267 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.063284 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.063295 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.105733 4870 scope.go:117] "RemoveContainer" containerID="4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.166915 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.166978 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.166998 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.167028 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.167049 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.269046 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.269198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.269224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.269257 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.269279 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.371862 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.371942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.371966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.372001 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.372022 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.475408 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.475473 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.475494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.475519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.475537 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.579185 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.579249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.579264 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.579296 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.579312 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.659427 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/1.log" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.662326 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.662873 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.682221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.682253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.682262 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.682277 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.682285 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.686387 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.703865 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.718938 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.733570 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.752040 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.766557 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.785207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.785245 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.785258 4870 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.785274 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.785288 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.789835 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:58Z\\\",\\\"message\\\":\\\"removal\\\\nI0312 00:09:58.808413 6846 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:58.808499 6846 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0312 00:09:58.808466 6846 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:58.808538 6846 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI0312 00:09:58.808563 6846 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:58.816427 6846 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:58.816443 6846 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:58.816460 6846 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:58.816482 6846 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:09:58.816486 6846 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0312 00:09:58.816499 6846 factory.go:656] Stopping watch factory\\\\nI0312 00:09:58.816511 6846 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:58.816535 6846 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:09:58.816601 6846 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:58.816549 6846 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0312 00:09:58.816556 6846 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.804486 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.816971 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.843977 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.858186 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc 
kubenswrapper[4870]: I0312 00:10:14.871183 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.881352 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.887910 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.887943 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.887953 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.887967 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.887976 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.893250 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86
e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.907232 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.916368 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.927844 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:14Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.990048 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:14 crc 
kubenswrapper[4870]: I0312 00:10:14.990077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.990085 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.990098 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:14 crc kubenswrapper[4870]: I0312 00:10:14.990107 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:14Z","lastTransitionTime":"2026-03-12T00:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.093274 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.093333 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.093353 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.093377 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.093397 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.104723 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.104723 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.104791 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:15 crc kubenswrapper[4870]: E0312 00:10:15.104896 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.104931 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:15 crc kubenswrapper[4870]: E0312 00:10:15.105029 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:15 crc kubenswrapper[4870]: E0312 00:10:15.105245 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:15 crc kubenswrapper[4870]: E0312 00:10:15.105368 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.198186 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.198246 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.198263 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.198287 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.198305 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.301686 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.301748 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.301764 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.301787 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.301804 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.404913 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.404963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.404979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.405003 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.405020 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.507971 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.508063 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.508086 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.508109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.508125 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.611122 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.611213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.611232 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.611255 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.611273 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.667806 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/2.log" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.668748 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/1.log" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.672412 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817" exitCode=1 Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.672471 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.672547 4870 scope.go:117] "RemoveContainer" containerID="4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.673666 4870 scope.go:117] "RemoveContainer" containerID="2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817" Mar 12 00:10:15 crc kubenswrapper[4870]: E0312 00:10:15.673986 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.696372 4870 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f
687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.710887 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.714590 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.714717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.714735 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.714805 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.714817 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.744915 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.759965 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc 
kubenswrapper[4870]: I0312 00:10:15.782023 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.804770 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.818728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.818794 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.818819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.818849 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.818872 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.823942 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86
e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.839180 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.852568 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.864835 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.875686 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.890070 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.905747 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.920855 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.920887 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.920896 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.920909 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.920920 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:15Z","lastTransitionTime":"2026-03-12T00:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.926076 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.938700 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.956109 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b4d8fc89f7957683ca921243b8206e4f05d5c7ab815909b2f756614607bd4b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:09:58Z\\\",\\\"message\\\":\\\"removal\\\\nI0312 00:09:58.808413 6846 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:09:58.808499 6846 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0312 00:09:58.808466 6846 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 00:09:58.808538 6846 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI0312 00:09:58.808563 6846 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 00:09:58.816427 6846 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:09:58.816443 6846 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:09:58.816460 6846 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:09:58.816482 6846 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:09:58.816486 6846 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0312 00:09:58.816499 6846 factory.go:656] Stopping watch factory\\\\nI0312 00:09:58.816511 6846 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:09:58.816535 6846 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:09:58.816601 6846 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:09:58.816549 6846 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0312 00:09:58.816556 6846 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/
cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:15 crc kubenswrapper[4870]: I0312 00:10:15.968045 4870 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:15Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.024096 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.024167 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.024185 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.024207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.024223 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.105522 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.129633 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.129668 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.129677 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.129693 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.129701 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.232944 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.232984 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.232996 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.233012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.233025 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.335555 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.335983 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.335994 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.336010 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.336022 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.438855 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.438942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.438958 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.438981 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.438998 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.542315 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.542380 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.542398 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.542423 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.542443 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.645671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.645734 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.645754 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.645779 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.645799 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.680680 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/2.log" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.688515 4870 scope.go:117] "RemoveContainer" containerID="2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817" Mar 12 00:10:16 crc kubenswrapper[4870]: E0312 00:10:16.688990 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.689394 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.693991 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.694587 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.708835 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.729066 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.745867 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.750677 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.750724 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.750736 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.750756 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.750771 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.771780 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z 
is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.796094 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.812125 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.832372 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.849073 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.853992 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.854034 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.854045 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.854062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.854072 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.863559 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.881359 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.896042 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.908586 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.920323 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.937861 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.947979 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.956669 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.956697 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.956705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.956720 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.956731 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:16Z","lastTransitionTime":"2026-03-12T00:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.961501 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc 
kubenswrapper[4870]: I0312 00:10:16.978424 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:16 crc kubenswrapper[4870]: I0312 00:10:16.996224 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\
\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:16Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.008543 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc 
kubenswrapper[4870]: I0312 00:10:17.023583 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.038632 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.053928 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.058988 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.059022 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.059035 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.059051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.059066 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.065576 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.077251 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.090889 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.103973 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.104044 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.103983 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:17 crc kubenswrapper[4870]: E0312 00:10:17.104137 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:17 crc kubenswrapper[4870]: E0312 00:10:17.104270 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.104239 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: E0312 00:10:17.104483 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.104851 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:17 crc kubenswrapper[4870]: E0312 00:10:17.105072 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.114932 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.130742 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.141562 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.158249 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.162310 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.162334 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.162361 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.162374 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.162383 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.169829 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fe
c414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.198520 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.220526 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.235747 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:17Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.265258 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.265287 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.265297 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.265313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.265324 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.368681 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.368749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.368767 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.368793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.368815 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.471922 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.471971 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.471987 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.472009 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.472028 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.574553 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.574585 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.574593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.574606 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.574614 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.677493 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.677532 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.677540 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.677552 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.677561 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.780282 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.780340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.780359 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.780382 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.780402 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.883756 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.883807 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.883824 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.883848 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.883866 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.986931 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.986992 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.987009 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.987031 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:17 crc kubenswrapper[4870]: I0312 00:10:17.987048 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:17Z","lastTransitionTime":"2026-03-12T00:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.090299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.090362 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.090380 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.090402 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.090419 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.193191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.193237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.193249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.193266 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.193278 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.296591 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.296649 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.296666 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.296690 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.296709 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.399997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.400057 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.400073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.400096 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.400113 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.503377 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.503426 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.503479 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.503502 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.503520 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.606622 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.606751 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.606773 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.606796 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.606814 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.709943 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.709998 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.710016 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.710049 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.710076 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.814018 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.814069 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.814085 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.814108 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.814126 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.917645 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.917698 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.917717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.917750 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:18 crc kubenswrapper[4870]: I0312 00:10:18.917767 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:18Z","lastTransitionTime":"2026-03-12T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.021476 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.021546 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.021563 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.021585 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.021604 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.043594 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.043785 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-12 00:10:51.043753476 +0000 UTC m=+141.647169826 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.043848 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.043927 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.044064 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.044067 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.044117 4870 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:51.044106416 +0000 UTC m=+141.647522826 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.044202 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:51.044127817 +0000 UTC m=+141.647544177 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.104872 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.104975 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.104972 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.104914 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.105193 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.105359 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.105541 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.105672 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.124838 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.124896 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.124914 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.124938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.124957 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.144791 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.144857 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.144927 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.145015 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.145109 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:51.145082996 +0000 UTC m=+141.748499336 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.145132 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.145638 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.145663 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.145782 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:51.145747275 +0000 UTC m=+141.749163625 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.149419 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.149489 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.149520 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:19 crc kubenswrapper[4870]: E0312 00:10:19.149633 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:10:51.149602125 +0000 UTC m=+141.753018485 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.228868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.228928 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.228945 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.228966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.228983 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.332903 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.332946 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.332955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.332969 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.332978 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.435727 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.435783 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.435795 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.435811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.435823 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.539551 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.539620 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.539653 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.539680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.539702 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.642637 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.642678 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.642689 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.642703 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.642715 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.745190 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.745252 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.745275 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.745300 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.745321 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.848710 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.848753 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.848764 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.848780 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.848791 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.952235 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.952298 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.952315 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.952339 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:19 crc kubenswrapper[4870]: I0312 00:10:19.952356 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:19Z","lastTransitionTime":"2026-03-12T00:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.055725 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.055784 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.055801 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.055827 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.055844 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.129706 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.147373 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.162294 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.162363 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.162385 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.162452 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.162477 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.184050 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.200803 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc 
kubenswrapper[4870]: I0312 00:10:20.224901 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.245661 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.265395 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.265449 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.265464 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.265485 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.265500 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.267513 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.281427 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.293456 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.309425 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.323848 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0e
e2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.334928 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.348856 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.362126 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.368278 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.368323 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.368341 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.368364 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.368381 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.379656 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.395631 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.412783 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:20Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.470402 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.470460 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.470480 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.470503 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.470519 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.572677 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.572735 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.572758 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.572785 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.572807 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.674906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.675223 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.675299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.675372 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.675433 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.778379 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.778453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.778473 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.778500 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.778522 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.881533 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.881590 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.881607 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.881630 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.881646 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.984747 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.984817 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.984834 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.984856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:20 crc kubenswrapper[4870]: I0312 00:10:20.984876 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:20Z","lastTransitionTime":"2026-03-12T00:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.087816 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.087938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.087966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.088035 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.088056 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.104540 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.104551 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.104692 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.104856 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.104904 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.105177 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.105291 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.105503 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.191671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.191737 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.191789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.191813 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.191831 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.295363 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.295859 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.296050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.296237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.296365 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.399990 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.400054 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.400072 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.400096 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.400114 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.502820 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.502882 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.502899 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.502925 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.502942 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.606500 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.606606 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.606624 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.606649 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.606668 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.608289 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.608360 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.608383 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.608408 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.608429 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.628600 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:21Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.634518 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.634571 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.634593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.634622 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.634647 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.654724 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:21Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.660766 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.660828 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.660851 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.660882 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.660905 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.747878 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:21Z is after 2025-08-24T17:21:41Z"
Mar 12 00:10:21 crc kubenswrapper[4870]: E0312 00:10:21.748559 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.751105 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.751203 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.751226 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.751257 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.751280 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.854745 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.854795 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.854811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.854834 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.854859 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.959140 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.959272 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.959305 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.959337 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:21 crc kubenswrapper[4870]: I0312 00:10:21.959359 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:21Z","lastTransitionTime":"2026-03-12T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.062107 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.062224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.062251 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.062278 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.062296 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.165572 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.165648 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.165671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.165697 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.165719 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.269465 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.269549 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.269569 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.269601 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.269627 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.372422 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.372483 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.372503 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.372529 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.372550 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.475121 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.475235 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.475259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.475288 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.475311 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.580826 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.581353 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.581509 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.581642 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.581783 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.685202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.685723 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.685884 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.686073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.686264 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.790089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.790173 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.790195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.790218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.790237 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.893738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.893793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.893811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.893833 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.893851 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.997680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.997753 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.997775 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.997805 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:22 crc kubenswrapper[4870]: I0312 00:10:22.997829 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:22Z","lastTransitionTime":"2026-03-12T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.101050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.101111 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.101131 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.101188 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.101206 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.104445 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.104518 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.104539 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 12 00:10:23 crc kubenswrapper[4870]: E0312 00:10:23.104674 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.104686 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 12 00:10:23 crc kubenswrapper[4870]: E0312 00:10:23.104803 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288"
Mar 12 00:10:23 crc kubenswrapper[4870]: E0312 00:10:23.104902 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 12 00:10:23 crc kubenswrapper[4870]: E0312 00:10:23.105097 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.203721 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.203780 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.203797 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.203820 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.203889 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.306876 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.306920 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.306930 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.306946 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.306956 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.409348 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.409406 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.409423 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.409445 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.409463 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.512993 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.513054 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.513070 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.513094 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.513111 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.617099 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.617183 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.617200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.617227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.617245 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.719867 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.719933 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.720015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.720044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.720068 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.823315 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.823379 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.823402 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.823425 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.823442 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.925933 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.926477 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.926705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.926870 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:23 crc kubenswrapper[4870]: I0312 00:10:23.927001 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:23Z","lastTransitionTime":"2026-03-12T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.030104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.030536 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.030669 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.030785 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.030911 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.120207 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.133646 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.133899 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.134035 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.134206 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.134368 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.237557 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.237619 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.237637 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.237662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.237681 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.340432 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.340495 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.340513 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.340539 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.340556 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.443569 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.444040 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.444312 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.444514 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.444666 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.550383 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.550425 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.550438 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.550454 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.550466 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.653295 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.653351 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.653367 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.653389 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.653407 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.757935 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.758021 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.758045 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.758075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.758096 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.862608 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.863090 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.863290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.863450 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.863605 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.966555 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.966621 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.966640 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.966665 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:24 crc kubenswrapper[4870]: I0312 00:10:24.966683 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:24Z","lastTransitionTime":"2026-03-12T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.070921 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.071011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.071036 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.071071 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.071093 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.104591 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.104652 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:25 crc kubenswrapper[4870]: E0312 00:10:25.104779 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.104605 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:25 crc kubenswrapper[4870]: E0312 00:10:25.104968 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.105039 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:25 crc kubenswrapper[4870]: E0312 00:10:25.105122 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:25 crc kubenswrapper[4870]: E0312 00:10:25.105249 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.174550 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.174595 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.174607 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.174622 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.174636 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.277733 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.278167 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.278316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.278455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.278579 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.381279 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.381353 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.381371 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.381396 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.381414 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.485198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.485282 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.485306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.485338 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.485367 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.588716 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.588787 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.588806 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.588831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.588850 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.691603 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.691947 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.692051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.692131 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.692238 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.795114 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.795192 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.795209 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.795232 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.795249 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.897513 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.897577 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.897594 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.897620 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:25 crc kubenswrapper[4870]: I0312 00:10:25.897639 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:25Z","lastTransitionTime":"2026-03-12T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.001036 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.001483 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.001672 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.001805 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.001935 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.105089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.105529 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.105612 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.105683 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.105769 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.208246 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.208969 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.209202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.209354 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.209419 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.311854 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.312238 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.312400 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.312506 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.312608 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.415447 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.415519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.415538 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.415563 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.415581 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.518637 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.518699 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.518717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.518743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.518761 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.622118 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.622212 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.622230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.622254 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.622273 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.726129 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.726218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.726236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.726262 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.726280 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.829576 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.829638 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.829656 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.829682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.829702 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.933728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.934204 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.934404 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.934556 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:26 crc kubenswrapper[4870]: I0312 00:10:26.934683 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:26Z","lastTransitionTime":"2026-03-12T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.038874 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.038977 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.038997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.039025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.039044 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.104596 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.104653 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:27 crc kubenswrapper[4870]: E0312 00:10:27.104764 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.104828 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:27 crc kubenswrapper[4870]: E0312 00:10:27.104936 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:27 crc kubenswrapper[4870]: E0312 00:10:27.105077 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.105357 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:27 crc kubenswrapper[4870]: E0312 00:10:27.105522 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.140271 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.141970 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.142032 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.142051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.142072 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.142090 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.167857 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.185161 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.218593 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.238082 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.245410 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.245487 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.245503 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.245528 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.245545 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.259576 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc 
kubenswrapper[4870]: I0312 00:10:27.284076 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e1
5e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 
00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.308899 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.327996 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.348210 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.350068 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.350263 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.350354 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.350454 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.350557 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.367449 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.387725 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d
157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.403508 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.424756 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.443752 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.454468 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.454521 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.454536 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc 
kubenswrapper[4870]: I0312 00:10:27.454556 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.454572 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.463622 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.482973 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.515386 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.534177 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:27Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.557855 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.558075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.558249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc 
kubenswrapper[4870]: I0312 00:10:27.558389 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.558516 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.662137 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.662238 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.662257 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.662283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.662300 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.766330 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.766383 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.766404 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.766434 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.766458 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.869279 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.869342 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.869387 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.869415 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.869433 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.972292 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.972362 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.972384 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.972415 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:27 crc kubenswrapper[4870]: I0312 00:10:27.972438 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:27Z","lastTransitionTime":"2026-03-12T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.075997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.076068 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.076117 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.076202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.076230 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.179658 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.179712 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.179732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.179753 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.179771 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.282890 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.282959 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.282975 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.282999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.283017 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.386178 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.386264 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.386289 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.386316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.386332 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.490266 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.490346 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.490370 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.490400 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.490423 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.593957 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.594011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.594025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.594044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.594054 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.697385 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.697467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.697485 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.697511 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.697529 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.800951 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.801015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.801027 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.801047 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.801063 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.903982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.904058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.904080 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.904113 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:28 crc kubenswrapper[4870]: I0312 00:10:28.904132 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:28Z","lastTransitionTime":"2026-03-12T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.006341 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.006406 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.006426 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.006451 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.006467 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.104543 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:29 crc kubenswrapper[4870]: E0312 00:10:29.104723 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.104779 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.104795 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.105440 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:29 crc kubenswrapper[4870]: E0312 00:10:29.105583 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:29 crc kubenswrapper[4870]: E0312 00:10:29.105672 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:29 crc kubenswrapper[4870]: E0312 00:10:29.105844 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.106192 4870 scope.go:117] "RemoveContainer" containerID="2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817" Mar 12 00:10:29 crc kubenswrapper[4870]: E0312 00:10:29.106561 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.110043 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.110099 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.110123 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.110180 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.110205 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.214222 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.214696 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.214716 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.214738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.214818 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.318382 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.318460 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.318485 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.318517 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.318539 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.420996 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.421030 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.421039 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.421052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.421060 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.525049 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.525135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.525208 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.525243 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.525269 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.629090 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.629210 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.629238 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.629270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.629295 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.731852 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.731912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.731928 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.731953 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.731970 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.834687 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.834718 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.834727 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.834739 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.834748 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.938454 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.938485 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.938494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.938508 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:29 crc kubenswrapper[4870]: I0312 00:10:29.938517 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:29Z","lastTransitionTime":"2026-03-12T00:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.041023 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.041055 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.041063 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.041076 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.041085 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:30Z","lastTransitionTime":"2026-03-12T00:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.131582 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: E0312 00:10:30.141811 4870 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.153904 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.168406 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc 
kubenswrapper[4870]: I0312 00:10:30.186847 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e1
5e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 
00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.207937 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: E0312 00:10:30.211568 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.227422 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.243697 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.257975 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.274093 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.288916 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.308226 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.330374 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.342611 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.355364 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.378363 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.387641 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.400064 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:30 crc kubenswrapper[4870]: I0312 00:10:30.409198 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:30Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.104437 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.104558 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.104669 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.104753 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.104949 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.105077 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.105244 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.105365 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.883893 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.884019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.884041 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.884065 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.884082 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:31Z","lastTransitionTime":"2026-03-12T00:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.905208 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:31Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.911393 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.911485 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.911505 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.911534 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.911552 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:31Z","lastTransitionTime":"2026-03-12T00:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.932793 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:31Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.940189 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.940239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.940290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.940318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.940337 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:31Z","lastTransitionTime":"2026-03-12T00:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.965815 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:31Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.971993 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.972200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.972280 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.972355 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:31 crc kubenswrapper[4870]: I0312 00:10:31.972481 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:31Z","lastTransitionTime":"2026-03-12T00:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:31 crc kubenswrapper[4870]: E0312 00:10:31.995719 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:31Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:32 crc kubenswrapper[4870]: I0312 00:10:32.002306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:32 crc kubenswrapper[4870]: I0312 00:10:32.002374 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:32 crc kubenswrapper[4870]: I0312 00:10:32.002398 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:32 crc kubenswrapper[4870]: I0312 00:10:32.002427 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:32 crc kubenswrapper[4870]: I0312 00:10:32.002522 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:32Z","lastTransitionTime":"2026-03-12T00:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:32 crc kubenswrapper[4870]: E0312 00:10:32.020944 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:32Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:32 crc kubenswrapper[4870]: E0312 00:10:32.021129 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:10:33 crc kubenswrapper[4870]: I0312 00:10:33.104640 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:33 crc kubenswrapper[4870]: I0312 00:10:33.104738 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:33 crc kubenswrapper[4870]: I0312 00:10:33.104800 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:33 crc kubenswrapper[4870]: I0312 00:10:33.104651 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:33 crc kubenswrapper[4870]: E0312 00:10:33.104878 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:33 crc kubenswrapper[4870]: E0312 00:10:33.104966 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:33 crc kubenswrapper[4870]: E0312 00:10:33.105133 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:33 crc kubenswrapper[4870]: E0312 00:10:33.105294 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.104308 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.104399 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.104348 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.104321 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:35 crc kubenswrapper[4870]: E0312 00:10:35.104576 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:35 crc kubenswrapper[4870]: E0312 00:10:35.104689 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:35 crc kubenswrapper[4870]: E0312 00:10:35.104902 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:35 crc kubenswrapper[4870]: E0312 00:10:35.105063 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:35 crc kubenswrapper[4870]: E0312 00:10:35.212415 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.767360 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/0.log" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.767439 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ad1e98a-cb66-436d-8e5e-301724f70769" containerID="1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2" exitCode=1 Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.767480 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerDied","Data":"1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2"} Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.768185 4870 scope.go:117] "RemoveContainer" containerID="1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.793238 4870 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.807089 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.818889 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.839773 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.854675 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.867600 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.880592 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.896173 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.909640 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.930066 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.946323 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.956936 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.970758 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.986245 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:35 crc kubenswrapper[4870]: I0312 00:10:35.998416 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:35Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.008528 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.028427 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf8
4963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.049081 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.774015 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/0.log" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.774087 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerStarted","Data":"c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae"} Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.796878 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9
f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.815038 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.835466 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.853510 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.869444 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.888760 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] 
Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.901938 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.920749 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.936792 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.953740 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:36 crc kubenswrapper[4870]: I0312 00:10:36.970105 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.000063 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:36Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.017416 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:37Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.040028 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:37Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.055399 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:37Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.087890 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:37Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.104248 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.104337 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:37 crc kubenswrapper[4870]: E0312 00:10:37.104403 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.104421 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.104483 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:37 crc kubenswrapper[4870]: E0312 00:10:37.104607 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:37 crc kubenswrapper[4870]: E0312 00:10:37.104846 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:37 crc kubenswrapper[4870]: E0312 00:10:37.104950 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.109599 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:37Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:37 crc kubenswrapper[4870]: I0312 00:10:37.126138 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:37Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:39 crc 
kubenswrapper[4870]: I0312 00:10:39.104919 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:39 crc kubenswrapper[4870]: I0312 00:10:39.104980 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:39 crc kubenswrapper[4870]: I0312 00:10:39.104958 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:39 crc kubenswrapper[4870]: I0312 00:10:39.104929 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:39 crc kubenswrapper[4870]: E0312 00:10:39.105197 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:39 crc kubenswrapper[4870]: E0312 00:10:39.105309 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:39 crc kubenswrapper[4870]: E0312 00:10:39.105457 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:39 crc kubenswrapper[4870]: E0312 00:10:39.105711 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.126693 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.147256 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.163973 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.189547 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.204593 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: E0312 00:10:40.213120 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.219357 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.232373 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.245555 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.256690 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.271430 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.291332 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf8
4963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.305478 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.324389 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc 
kubenswrapper[4870]: I0312 00:10:40.338828 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.350120 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.371414 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] 
Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.389406 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:40 crc kubenswrapper[4870]: I0312 00:10:40.410283 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:40Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.104747 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.104798 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.104797 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.104747 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:41 crc kubenswrapper[4870]: E0312 00:10:41.104931 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:41 crc kubenswrapper[4870]: E0312 00:10:41.105327 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:41 crc kubenswrapper[4870]: E0312 00:10:41.106230 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:41 crc kubenswrapper[4870]: E0312 00:10:41.106600 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.106632 4870 scope.go:117] "RemoveContainer" containerID="2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.796764 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/2.log" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.800409 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.800862 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.832717 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.849242 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.862206 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc 
kubenswrapper[4870]: I0312 00:10:41.881694 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e1
5e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 
00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.902933 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.916469 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.933500 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.945265 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.967065 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] 
Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.979611 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:41 crc kubenswrapper[4870]: I0312 00:10:41.998821 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:41Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.012061 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.029269 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.044307 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.071945 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.087557 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.112062 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.131320 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.306866 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.306900 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.306909 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.306922 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.306935 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:42Z","lastTransitionTime":"2026-03-12T00:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:10:42 crc kubenswrapper[4870]: E0312 00:10:42.325687 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.330002 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.330031 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.330040 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.330054 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.330062 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:42Z","lastTransitionTime":"2026-03-12T00:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:42 crc kubenswrapper[4870]: E0312 00:10:42.347024 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.351158 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.351191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.351200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.351215 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.351224 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:42Z","lastTransitionTime":"2026-03-12T00:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:42 crc kubenswrapper[4870]: E0312 00:10:42.367385 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.371557 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.371587 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.371596 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.371610 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.371622 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:42Z","lastTransitionTime":"2026-03-12T00:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:42 crc kubenswrapper[4870]: E0312 00:10:42.389475 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.394226 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.394273 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.394286 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.394312 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.394327 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:42Z","lastTransitionTime":"2026-03-12T00:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:42 crc kubenswrapper[4870]: E0312 00:10:42.408494 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: E0312 00:10:42.408786 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.806675 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/3.log" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.807808 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/2.log" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.812066 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" exitCode=1 Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.812136 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.812248 4870 scope.go:117] "RemoveContainer" 
containerID="2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.813476 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:10:42 crc kubenswrapper[4870]: E0312 00:10:42.814814 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.831387 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"stat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.845797 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.864829 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc 
kubenswrapper[4870]: I0312 00:10:42.890018 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.954378 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc kubenswrapper[4870]: I0312 00:10:42.976309 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:42 crc 
kubenswrapper[4870]: I0312 00:10:42.993542 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:42Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.007744 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.023165 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] 
Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.043057 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.058662 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.076977 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.096105 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.104433 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.104459 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:43 crc kubenswrapper[4870]: E0312 00:10:43.104634 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:43 crc kubenswrapper[4870]: E0312 00:10:43.104748 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.104459 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:43 crc kubenswrapper[4870]: E0312 00:10:43.104893 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.105018 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:43 crc kubenswrapper[4870]: E0312 00:10:43.105254 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.113932 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.147607 4870 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fe3e39497afc7fa8e9f5cc7b696a38a881b2c4eec39a12d02410262e9ef5817\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:15Z\\\",\\\"message\\\":\\\"ce_openshift-controller-manager/controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.149:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {cab7c637-a021-4a4d-a4b9-06d63c44316f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0312 00:10:15.093476 7066 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 00:10:15.093827 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0312 00:10:15.093866 7066 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:15.093904 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 00:10:15.093943 7066 factory.go:656] Stopping watch factory\\\\nI0312 00:10:15.093950 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:15.093958 7066 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:15.093987 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:15.094005 7066 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 00:10:15.094093 7066 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"ler 2 for removal\\\\nI0312 00:10:42.208921 7372 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:10:42.208946 7372 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:10:42.208979 7372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:10:42.209021 7372 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 00:10:42.209032 7372 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 
00:10:42.209040 7372 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0312 00:10:42.209048 7372 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:42.209215 7372 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:10:42.211238 7372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:42.212274 7372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:10:42.212341 7372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:10:42.212390 7372 factory.go:656] Stopping watch factory\\\\nI0312 00:10:42.212418 7372 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:42.212537 7372 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.164501 4870 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.181581 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.199247 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.819821 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/3.log" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.826903 4870 scope.go:117] "RemoveContainer" 
containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:10:43 crc kubenswrapper[4870]: E0312 00:10:43.827172 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.845039 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.875883 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"ler 2 for removal\\\\nI0312 00:10:42.208921 7372 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:10:42.208946 7372 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:10:42.208979 7372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:10:42.209021 7372 handler.go:208] Removed 
*v1.EgressFirewall event handler 9\\\\nI0312 00:10:42.209032 7372 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:42.209040 7372 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0312 00:10:42.209048 7372 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:42.209215 7372 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:10:42.211238 7372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:42.212274 7372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:10:42.212341 7372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:10:42.212390 7372 factory.go:656] Stopping watch factory\\\\nI0312 00:10:42.212418 7372 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:42.212537 7372 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.894962 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.911301 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.931636 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.951064 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.968800 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:43 crc kubenswrapper[4870]: I0312 00:10:43.989192 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:43Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.005138 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.041318 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.061489 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.077124 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc 
kubenswrapper[4870]: I0312 00:10:44.094080 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
xx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.117969 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.140242 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 
00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.159592 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.175737 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:44 crc kubenswrapper[4870]: I0312 00:10:44.187996 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:44Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:45 crc kubenswrapper[4870]: I0312 00:10:45.104475 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:45 crc kubenswrapper[4870]: I0312 00:10:45.104550 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:45 crc kubenswrapper[4870]: I0312 00:10:45.104592 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:45 crc kubenswrapper[4870]: I0312 00:10:45.104598 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:45 crc kubenswrapper[4870]: E0312 00:10:45.104761 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:45 crc kubenswrapper[4870]: E0312 00:10:45.104885 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:45 crc kubenswrapper[4870]: E0312 00:10:45.105052 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:45 crc kubenswrapper[4870]: E0312 00:10:45.105316 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:45 crc kubenswrapper[4870]: E0312 00:10:45.214736 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:10:47 crc kubenswrapper[4870]: I0312 00:10:47.104790 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:47 crc kubenswrapper[4870]: I0312 00:10:47.104878 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:47 crc kubenswrapper[4870]: E0312 00:10:47.104945 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:47 crc kubenswrapper[4870]: E0312 00:10:47.105030 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:47 crc kubenswrapper[4870]: I0312 00:10:47.105087 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:47 crc kubenswrapper[4870]: E0312 00:10:47.105130 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:47 crc kubenswrapper[4870]: I0312 00:10:47.105172 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:47 crc kubenswrapper[4870]: E0312 00:10:47.105210 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:49 crc kubenswrapper[4870]: I0312 00:10:49.104276 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:49 crc kubenswrapper[4870]: I0312 00:10:49.104290 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:49 crc kubenswrapper[4870]: I0312 00:10:49.104339 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:49 crc kubenswrapper[4870]: E0312 00:10:49.104993 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:49 crc kubenswrapper[4870]: E0312 00:10:49.104818 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:49 crc kubenswrapper[4870]: I0312 00:10:49.104496 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:49 crc kubenswrapper[4870]: E0312 00:10:49.105173 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:49 crc kubenswrapper[4870]: E0312 00:10:49.105392 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:49 crc kubenswrapper[4870]: I0312 00:10:49.118209 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.128275 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc 
kubenswrapper[4870]: I0312 00:10:50.151047 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.164858 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.178933 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0a6b45-136a-40ae-9c17-5d08ffbc73fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7785c12c9b7a092fc3ff5d8929307966c9890e1f1b577f50ef115a330d7dbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05afc014f8bfef6841d868de1e130de19b22469b9ac07013bb1519248336ecfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f17248a7c5b4217a907f04f122eeb15deef06693d907293f527f6bd1748ee48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca600cedaefbe690245c5250e29b404ca53d99b0dac03891f81b1e878082704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fca600cedaefbe690245c5250e29b404ca53d99b0dac03891f81b1e878082704\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.191565 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.202985 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fxx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: E0312 00:10:50.215425 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.221999 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.242120 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9
f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.259764 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.273857 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.285716 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.333511 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.361028 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"ler 2 for removal\\\\nI0312 00:10:42.208921 7372 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:10:42.208946 7372 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:10:42.208979 7372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:10:42.209021 7372 handler.go:208] Removed 
*v1.EgressFirewall event handler 9\\\\nI0312 00:10:42.209032 7372 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:42.209040 7372 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0312 00:10:42.209048 7372 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:42.209215 7372 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:10:42.211238 7372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:42.212274 7372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:10:42.212341 7372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:10:42.212390 7372 factory.go:656] Stopping watch factory\\\\nI0312 00:10:42.212418 7372 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:42.212537 7372 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.381480 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.397048 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.414293 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.431975 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.456006 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d00
99b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:50 crc kubenswrapper[4870]: I0312 00:10:50.475057 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:50Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.104233 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.104233 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.104371 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.104262 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.104233 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.104452 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.104592 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.104717 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.139854 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.140020 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.140072 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.14003964 +0000 UTC m=+205.743455960 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.140191 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.140197 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.140274 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.140252646 +0000 UTC m=+205.743668986 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.140325 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.140367 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.140358609 +0000 UTC m=+205.743774939 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.241052 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.241116 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:51 crc kubenswrapper[4870]: I0312 00:10:51.241212 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241348 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241398 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241428 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241447 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241486 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs podName:5c62c8d9-0f6b-4ec4-af08-fae75fb41288 nodeName:}" failed. 
No retries permitted until 2026-03-12 00:11:55.241454419 +0000 UTC m=+205.844870759 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs") pod "network-metrics-daemon-xkrk6" (UID: "5c62c8d9-0f6b-4ec4-af08-fae75fb41288") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241479 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241523 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.24150026 +0000 UTC m=+205.844916610 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241540 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241564 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:51 crc kubenswrapper[4870]: E0312 00:10:51.241670 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.241632234 +0000 UTC m=+205.845048584 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.716749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.716809 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.716832 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.716863 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.716885 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:52Z","lastTransitionTime":"2026-03-12T00:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:52 crc kubenswrapper[4870]: E0312 00:10:52.737208 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.748375 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.748525 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.748614 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.748703 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.748787 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:52Z","lastTransitionTime":"2026-03-12T00:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:52 crc kubenswrapper[4870]: E0312 00:10:52.766459 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.771382 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.771417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.771430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.771448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.771460 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:52Z","lastTransitionTime":"2026-03-12T00:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:52 crc kubenswrapper[4870]: E0312 00:10:52.789963 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.795360 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.795394 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.795406 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.795420 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.795432 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:52Z","lastTransitionTime":"2026-03-12T00:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:52 crc kubenswrapper[4870]: E0312 00:10:52.813288 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.818386 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.818533 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.818562 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.818585 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:10:52 crc kubenswrapper[4870]: I0312 00:10:52.818603 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:10:52Z","lastTransitionTime":"2026-03-12T00:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:10:52 crc kubenswrapper[4870]: E0312 00:10:52.837034 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:10:52Z is after 2025-08-24T17:21:41Z" Mar 12 00:10:52 crc kubenswrapper[4870]: E0312 00:10:52.837313 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:10:53 crc kubenswrapper[4870]: I0312 00:10:53.104886 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:53 crc kubenswrapper[4870]: I0312 00:10:53.105013 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:53 crc kubenswrapper[4870]: I0312 00:10:53.105222 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:53 crc kubenswrapper[4870]: E0312 00:10:53.105259 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:53 crc kubenswrapper[4870]: I0312 00:10:53.105290 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:53 crc kubenswrapper[4870]: E0312 00:10:53.105365 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:53 crc kubenswrapper[4870]: E0312 00:10:53.105549 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:53 crc kubenswrapper[4870]: E0312 00:10:53.105821 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:55 crc kubenswrapper[4870]: I0312 00:10:55.104493 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:55 crc kubenswrapper[4870]: I0312 00:10:55.104559 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:55 crc kubenswrapper[4870]: I0312 00:10:55.104566 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:55 crc kubenswrapper[4870]: E0312 00:10:55.104782 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:55 crc kubenswrapper[4870]: I0312 00:10:55.104809 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:55 crc kubenswrapper[4870]: E0312 00:10:55.104917 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:55 crc kubenswrapper[4870]: E0312 00:10:55.105484 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:55 crc kubenswrapper[4870]: E0312 00:10:55.105380 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:55 crc kubenswrapper[4870]: E0312 00:10:55.217322 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:10:57 crc kubenswrapper[4870]: I0312 00:10:57.104756 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:57 crc kubenswrapper[4870]: I0312 00:10:57.104796 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:57 crc kubenswrapper[4870]: I0312 00:10:57.104804 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:57 crc kubenswrapper[4870]: I0312 00:10:57.104842 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:57 crc kubenswrapper[4870]: E0312 00:10:57.104960 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:57 crc kubenswrapper[4870]: E0312 00:10:57.105122 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:57 crc kubenswrapper[4870]: E0312 00:10:57.105223 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:57 crc kubenswrapper[4870]: E0312 00:10:57.105284 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:59 crc kubenswrapper[4870]: I0312 00:10:59.104805 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:10:59 crc kubenswrapper[4870]: I0312 00:10:59.104844 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:10:59 crc kubenswrapper[4870]: I0312 00:10:59.104904 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:10:59 crc kubenswrapper[4870]: I0312 00:10:59.104969 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:10:59 crc kubenswrapper[4870]: E0312 00:10:59.105789 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:10:59 crc kubenswrapper[4870]: E0312 00:10:59.105978 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:10:59 crc kubenswrapper[4870]: E0312 00:10:59.106203 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:10:59 crc kubenswrapper[4870]: E0312 00:10:59.106350 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:10:59 crc kubenswrapper[4870]: I0312 00:10:59.106480 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:10:59 crc kubenswrapper[4870]: E0312 00:10:59.106856 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.121880 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217db5b4-2e71-4611-8091-53f047a1b1e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48960b6fec414bb7ab395b92cf9c04066787ffa47002f5d973d031acb9d0a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81be0bf0cd249ed2da8e24ba73dce70a548f
6880f3c0f6be877fa601711c219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bn5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wrxrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.142050 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"467385e2-3bbf-4cf0-909a-8e878b5d86dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:42Z\\\",\\\"message\\\":\\\"ler 2 for removal\\\\nI0312 00:10:42.208921 7372 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 00:10:42.208946 7372 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 00:10:42.208979 7372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 00:10:42.209021 7372 handler.go:208] Removed 
*v1.EgressFirewall event handler 9\\\\nI0312 00:10:42.209032 7372 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 00:10:42.209040 7372 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0312 00:10:42.209048 7372 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 00:10:42.209215 7372 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 00:10:42.211238 7372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 00:10:42.212274 7372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 00:10:42.212341 7372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 00:10:42.212390 7372 factory.go:656] Stopping watch factory\\\\nI0312 00:10:42.212418 7372 ovnkube.go:599] Stopped ovnkube\\\\nI0312 00:10:42.212537 7372 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:10:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56aa5ec25349c9d684
096c807f43d40bfe4012611ed211f40cb356799f3d24e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hr49h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xwrqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.154830 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"988c0290-1e98-46c8-8253-a4718914b9ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98909d7c6bb27dac0eb9a458d9e92605b0f5a22c021884964b61a3ab8413c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00
ba17a4db07e37085fd3fc4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs9fp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-84dfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.167993 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa5ede7f-da1a-4dc4-9ead-57fe7bba311e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49ef97db3d2caedd32eb0f2e54287be403f82804b06bda36815211a90fea821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e7f3783ac937424eca541e178b4edb6044f1d8c6fee9d592ee687256043d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.181687 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.198044 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.214863 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: E0312 00:11:00.218274 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.232377 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbda14f-f860-4f24-ab29-43678602f4e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b388a95aee74934c8aa78f3065ae94329d84916f1c225f1aa7a0ce731109c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57c1799e405da06c7fe1439b1ed663218a2f989f1d9bdbd15fd5dbcce6cd7987\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94518da2d211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94518da2d
211015f0a6f6bedf6300ec18677ff35169a6a660953478c7dfead75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53a38e9340ce3e065cf230ed2850da06e66980d9559810eb4132528f21c818d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2d0099b3517fd091d70263e537723268989dc1a05e82d3c8cb4de5b139bf4e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0d92872ae4c2325f0125a13e3561d0d88644b4a3790d9db6344f687803e7a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65d00b029c7816fef06b0c0f196242b9af0139f4091a26e41bb2c568e1a3ea14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqh6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7fbnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.244234 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-46q4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354ecab7-6a88-47ab-8645-233ac3a125a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475df41d4ce7b5b07fb48d52f66d66436ae32d7a3f359b4247c0f9b2c7ac669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bblc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-46q4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.276648 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a1b5b8e-8b92-487c-8df7-95c4b04b831f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bd6a058cb913ca64266ef6657d17c94c6d66fd75049625fc97f97465b2e543b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33102782fb3099dd211ad34a1ee5754ac8e5904dbbf7aca2591d782cee295870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d597872765da448a17ffab5bd61b22d95beb6524aa2f4dcf31f0cfaff0b6618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bed53b341948f272cfdbe46afe51bf84963e049e847fef9e7cde115934d7f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda1a920d5fa541f3d4926b52bfad78e85b1340142b8b712000e786a6acd2466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013cd814480a044cfff2b9fb0252a77804f4b5ff382397fc282a1d596257f49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5441756d53279bcd92439eaa31892d068a0c23b9fbda39a28884cac19b8c9eed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fc3fa741ea2ef9afc08b1e11858ac26ba077b5b1439349c92e85236204dd302\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.291025 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a987707e-6301-41cb-94d9-cd805b7f0eb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05965f4b800d54b93b7716756a9bda1dee223dcce1348c4316bfed16bfca7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cfac8f36c65ca474bd50aea6880335017ad2cadfd3f4f23e40b8915c1ef750\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:02Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 00:08:32.366455 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0312 00:08:32.369622 1 observer_polling.go:159] Starting file observer\\\\nI0312 00:08:32.411682 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 00:08:32.423693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 00:09:02.789865 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 00:09:02.789955 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b313d591964488dc11e1187fb6a17b31328efcfe337ce61ed7306339b909e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdff4f4329aa0155db3cc76e9d76500e1592262853f81ede71e5391d07f5b5f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.302306 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c0a6b45-136a-40ae-9c17-5d08ffbc73fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7785c12c9b7a092fc3ff5d8929307966c9890e1f1b577f50ef115a330d7dbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05afc014f8bfef6841d868de1e130de19b22469b9ac07013bb1519248336ecfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f17248a7c5b4217a907f04f122eeb15deef06693d907293f527f6bd1748ee48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fca600cedaefbe690245c5250e29b404ca53d99b0dac03891f81b1e878082704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fca600cedaefbe690245c5250e29b404ca53d99b0dac03891f81b1e878082704\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.315940 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45w79\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkrk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc 
kubenswrapper[4870]: I0312 00:11:00.327558 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bnt7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d7348eb-2ff6-48ab-bfa6-fe83b6b9a111\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9b029d969d85a36f88940ec8ade98f7622b1fcecd767cd237b03efc44563792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
xx2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bnt7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.340031 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hngl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ad1e98a-cb66-436d-8e5e-301724f70769\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T00:10:34Z\\\",\\\"message\\\":\\\"2026-03-12T00:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf\\\\n2026-03-12T00:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b9a3feae-317e-4392-a93f-f919ae8437bf to /host/opt/cni/bin/\\\\n2026-03-12T00:09:49Z [verbose] multus-daemon started\\\\n2026-03-12T00:09:49Z [verbose] Readiness Indicator file check\\\\n2026-03-12T00:10:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwtlf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hngl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.351379 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e379442-f878-4e5e-beba-10a7caa4107b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T00:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T00:09:31Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 00:09:31.052531 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 00:09:31.052699 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 00:09:31.053372 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3013859896/tls.crt::/tmp/serving-cert-3013859896/tls.key\\\\\\\"\\\\nI0312 00:09:31.440066 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 00:09:31.441802 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 00:09:31.441821 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 00:09:31.441850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 00:09:31.441858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 00:09:31.446337 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 
00:09:31.446370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446377 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 00:09:31.446388 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 00:09:31.446392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 00:09:31.446396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 00:09:31.446400 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 00:09:31.446569 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 00:09:31.449134 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T00:09:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:08:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T00:08:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T00:08:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.362191 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bfc79910169acc1be33edffd40c30177a0b1a5d650c9e95334f8bf4a66d8768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.372418 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b159f749a1e5bbf3c49a95be3bdfbbeb0f039d46d58f1304f0f2f2c6c928c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://692b9bb39535e86e15a2345221dee2620b652e933fd84ce66c46a34b13ed05e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:00 crc kubenswrapper[4870]: I0312 00:11:00.382615 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T00:09:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89021c4dd07cc69e7db0904903064eb6c3d71d07d9565e87c5ec94b1a2880aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T00:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T00:11:00Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:01 crc kubenswrapper[4870]: I0312 00:11:01.104333 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:01 crc kubenswrapper[4870]: I0312 00:11:01.104406 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:01 crc kubenswrapper[4870]: I0312 00:11:01.104458 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:01 crc kubenswrapper[4870]: I0312 00:11:01.104597 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:01 crc kubenswrapper[4870]: E0312 00:11:01.104581 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:01 crc kubenswrapper[4870]: E0312 00:11:01.104701 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:01 crc kubenswrapper[4870]: E0312 00:11:01.104857 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:01 crc kubenswrapper[4870]: E0312 00:11:01.104894 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.037695 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.037775 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.037797 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.037826 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.037847 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:11:03Z","lastTransitionTime":"2026-03-12T00:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.058572 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:03Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.064702 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.064758 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.064781 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.064808 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.064832 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:11:03Z","lastTransitionTime":"2026-03-12T00:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.087376 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:03Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.093126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.093239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.093258 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.093281 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.093298 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:11:03Z","lastTransitionTime":"2026-03-12T00:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.104274 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.104308 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.104467 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.104573 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.104589 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.104682 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.104853 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.105071 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.114795 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:03Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.120712 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.120796 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.120818 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.120843 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.120861 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:11:03Z","lastTransitionTime":"2026-03-12T00:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.141791 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:03Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.147792 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.147845 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.147863 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.147885 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:11:03 crc kubenswrapper[4870]: I0312 00:11:03.147903 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:11:03Z","lastTransitionTime":"2026-03-12T00:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.171625 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T00:11:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44368512-0a5a-4975-893d-8f90738cf216\\\",\\\"systemUUID\\\":\\\"66422888-9fc1-4ea5-b606-e5b5b19260e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T00:11:03Z is after 2025-08-24T17:21:41Z" Mar 12 00:11:03 crc kubenswrapper[4870]: E0312 00:11:03.172020 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:11:05 crc kubenswrapper[4870]: I0312 00:11:05.104977 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:05 crc kubenswrapper[4870]: I0312 00:11:05.105094 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:05 crc kubenswrapper[4870]: I0312 00:11:05.105123 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:05 crc kubenswrapper[4870]: I0312 00:11:05.105016 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:05 crc kubenswrapper[4870]: E0312 00:11:05.105323 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:05 crc kubenswrapper[4870]: E0312 00:11:05.105527 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:05 crc kubenswrapper[4870]: E0312 00:11:05.105897 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:05 crc kubenswrapper[4870]: E0312 00:11:05.106000 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:05 crc kubenswrapper[4870]: E0312 00:11:05.220018 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:11:07 crc kubenswrapper[4870]: I0312 00:11:07.104300 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:07 crc kubenswrapper[4870]: I0312 00:11:07.104406 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:07 crc kubenswrapper[4870]: E0312 00:11:07.104459 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:07 crc kubenswrapper[4870]: E0312 00:11:07.104635 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:07 crc kubenswrapper[4870]: I0312 00:11:07.104741 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:07 crc kubenswrapper[4870]: E0312 00:11:07.104932 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:07 crc kubenswrapper[4870]: I0312 00:11:07.105235 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:07 crc kubenswrapper[4870]: E0312 00:11:07.105382 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:09 crc kubenswrapper[4870]: I0312 00:11:09.104747 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:09 crc kubenswrapper[4870]: I0312 00:11:09.104781 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:09 crc kubenswrapper[4870]: I0312 00:11:09.104836 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:09 crc kubenswrapper[4870]: I0312 00:11:09.104855 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:09 crc kubenswrapper[4870]: E0312 00:11:09.105857 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:09 crc kubenswrapper[4870]: E0312 00:11:09.106420 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:09 crc kubenswrapper[4870]: E0312 00:11:09.106560 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:09 crc kubenswrapper[4870]: E0312 00:11:09.106871 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.174373 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.174336235 podStartE2EDuration="1m20.174336235s" podCreationTimestamp="2026-03-12 00:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.147644386 +0000 UTC m=+160.751060766" watchObservedRunningTime="2026-03-12 00:11:10.174336235 +0000 UTC m=+160.777752585" Mar 12 00:11:10 crc kubenswrapper[4870]: E0312 00:11:10.221945 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.248837 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bnt7c" podStartSLOduration=105.248775828 podStartE2EDuration="1m45.248775828s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.247591404 +0000 UTC m=+160.851007734" watchObservedRunningTime="2026-03-12 00:11:10.248775828 +0000 UTC m=+160.852192178" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.278720 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8hngl" podStartSLOduration=105.278699759 podStartE2EDuration="1m45.278699759s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
00:11:10.26692947 +0000 UTC m=+160.870345830" watchObservedRunningTime="2026-03-12 00:11:10.278699759 +0000 UTC m=+160.882116069" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.314385 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=78.314357375 podStartE2EDuration="1m18.314357375s" podCreationTimestamp="2026-03-12 00:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.279014368 +0000 UTC m=+160.882430728" watchObservedRunningTime="2026-03-12 00:11:10.314357375 +0000 UTC m=+160.917773705" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.413112 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wrxrq" podStartSLOduration=104.413091558 podStartE2EDuration="1m44.413091558s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.383731052 +0000 UTC m=+160.987147372" watchObservedRunningTime="2026-03-12 00:11:10.413091558 +0000 UTC m=+161.016507888" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.429815 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podStartSLOduration=105.429792898 podStartE2EDuration="1m45.429792898s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.429111099 +0000 UTC m=+161.032527419" watchObservedRunningTime="2026-03-12 00:11:10.429792898 +0000 UTC m=+161.033209198" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.447049 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7fbnk" podStartSLOduration=105.447018484 podStartE2EDuration="1m45.447018484s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.446978053 +0000 UTC m=+161.050394363" watchObservedRunningTime="2026-03-12 00:11:10.447018484 +0000 UTC m=+161.050434794" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.461936 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-46q4m" podStartSLOduration=105.461888572 podStartE2EDuration="1m45.461888572s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.460874733 +0000 UTC m=+161.064291063" watchObservedRunningTime="2026-03-12 00:11:10.461888572 +0000 UTC m=+161.065304892" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.492670 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.492648138 podStartE2EDuration="1m14.492648138s" podCreationTimestamp="2026-03-12 00:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.492288847 +0000 UTC m=+161.095705167" watchObservedRunningTime="2026-03-12 00:11:10.492648138 +0000 UTC m=+161.096064448" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.508221 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=46.508206096 podStartE2EDuration="46.508206096s" podCreationTimestamp="2026-03-12 00:10:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.507237848 +0000 UTC m=+161.110654158" watchObservedRunningTime="2026-03-12 00:11:10.508206096 +0000 UTC m=+161.111622406" Mar 12 00:11:10 crc kubenswrapper[4870]: I0312 00:11:10.531965 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=21.531946039 podStartE2EDuration="21.531946039s" podCreationTimestamp="2026-03-12 00:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:10.51912205 +0000 UTC m=+161.122538360" watchObservedRunningTime="2026-03-12 00:11:10.531946039 +0000 UTC m=+161.135362349" Mar 12 00:11:11 crc kubenswrapper[4870]: I0312 00:11:11.104558 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:11 crc kubenswrapper[4870]: I0312 00:11:11.104735 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:11 crc kubenswrapper[4870]: E0312 00:11:11.104803 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:11 crc kubenswrapper[4870]: E0312 00:11:11.104935 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:11 crc kubenswrapper[4870]: I0312 00:11:11.105032 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:11 crc kubenswrapper[4870]: E0312 00:11:11.105187 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:11 crc kubenswrapper[4870]: I0312 00:11:11.105408 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:11 crc kubenswrapper[4870]: E0312 00:11:11.105699 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.104852 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.104861 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.104911 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:13 crc kubenswrapper[4870]: E0312 00:11:13.105296 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.105421 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:11:13 crc kubenswrapper[4870]: E0312 00:11:13.105535 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:13 crc kubenswrapper[4870]: E0312 00:11:13.105601 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.106875 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:13 crc kubenswrapper[4870]: E0312 00:11:13.106946 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xwrqb_openshift-ovn-kubernetes(467385e2-3bbf-4cf0-909a-8e878b5d86dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" Mar 12 00:11:13 crc kubenswrapper[4870]: E0312 00:11:13.107573 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.390592 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.390639 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.390650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.390665 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.390677 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T00:11:13Z","lastTransitionTime":"2026-03-12T00:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.453355 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn"] Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.453860 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.456605 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.457156 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.457243 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.457953 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.499427 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74aa42fd-4345-451b-845c-7a5910f1de28-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.499585 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74aa42fd-4345-451b-845c-7a5910f1de28-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.499709 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/74aa42fd-4345-451b-845c-7a5910f1de28-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.499753 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74aa42fd-4345-451b-845c-7a5910f1de28-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.499829 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74aa42fd-4345-451b-845c-7a5910f1de28-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.601045 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74aa42fd-4345-451b-845c-7a5910f1de28-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.601109 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74aa42fd-4345-451b-845c-7a5910f1de28-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.601206 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74aa42fd-4345-451b-845c-7a5910f1de28-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.601242 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74aa42fd-4345-451b-845c-7a5910f1de28-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.601326 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74aa42fd-4345-451b-845c-7a5910f1de28-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.601446 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74aa42fd-4345-451b-845c-7a5910f1de28-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.601450 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/74aa42fd-4345-451b-845c-7a5910f1de28-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.602550 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74aa42fd-4345-451b-845c-7a5910f1de28-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.611770 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74aa42fd-4345-451b-845c-7a5910f1de28-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.632668 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74aa42fd-4345-451b-845c-7a5910f1de28-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fkczn\" (UID: \"74aa42fd-4345-451b-845c-7a5910f1de28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.776105 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" Mar 12 00:11:13 crc kubenswrapper[4870]: W0312 00:11:13.793504 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74aa42fd_4345_451b_845c_7a5910f1de28.slice/crio-61f3f31cf24e6b0181c4b5fb1e1fd8860cc415895f03dda01bd7361a9493ab76 WatchSource:0}: Error finding container 61f3f31cf24e6b0181c4b5fb1e1fd8860cc415895f03dda01bd7361a9493ab76: Status 404 returned error can't find the container with id 61f3f31cf24e6b0181c4b5fb1e1fd8860cc415895f03dda01bd7361a9493ab76 Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.946885 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" event={"ID":"74aa42fd-4345-451b-845c-7a5910f1de28","Type":"ContainerStarted","Data":"ac9d472ae7b7c39e8ea448e6cd6140c09b49ee3b4103ebc45692ee4071249126"} Mar 12 00:11:13 crc kubenswrapper[4870]: I0312 00:11:13.947454 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" event={"ID":"74aa42fd-4345-451b-845c-7a5910f1de28","Type":"ContainerStarted","Data":"61f3f31cf24e6b0181c4b5fb1e1fd8860cc415895f03dda01bd7361a9493ab76"} Mar 12 00:11:14 crc kubenswrapper[4870]: I0312 00:11:14.145445 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 12 00:11:14 crc kubenswrapper[4870]: I0312 00:11:14.155519 4870 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 12 00:11:15 crc kubenswrapper[4870]: I0312 00:11:15.103902 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:15 crc kubenswrapper[4870]: I0312 00:11:15.103902 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:15 crc kubenswrapper[4870]: I0312 00:11:15.103957 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:15 crc kubenswrapper[4870]: E0312 00:11:15.104087 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:15 crc kubenswrapper[4870]: I0312 00:11:15.104134 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:15 crc kubenswrapper[4870]: E0312 00:11:15.104284 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:15 crc kubenswrapper[4870]: E0312 00:11:15.104527 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:15 crc kubenswrapper[4870]: E0312 00:11:15.104595 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:15 crc kubenswrapper[4870]: E0312 00:11:15.223808 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:11:17 crc kubenswrapper[4870]: I0312 00:11:17.104329 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:17 crc kubenswrapper[4870]: I0312 00:11:17.104320 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:17 crc kubenswrapper[4870]: I0312 00:11:17.104393 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:17 crc kubenswrapper[4870]: I0312 00:11:17.104710 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:17 crc kubenswrapper[4870]: E0312 00:11:17.105175 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:17 crc kubenswrapper[4870]: E0312 00:11:17.105456 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:17 crc kubenswrapper[4870]: E0312 00:11:17.105586 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:17 crc kubenswrapper[4870]: E0312 00:11:17.105783 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:19 crc kubenswrapper[4870]: I0312 00:11:19.104363 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:19 crc kubenswrapper[4870]: I0312 00:11:19.104446 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:19 crc kubenswrapper[4870]: I0312 00:11:19.104392 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:19 crc kubenswrapper[4870]: I0312 00:11:19.104392 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:19 crc kubenswrapper[4870]: E0312 00:11:19.104631 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:19 crc kubenswrapper[4870]: E0312 00:11:19.104765 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:19 crc kubenswrapper[4870]: E0312 00:11:19.104910 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:19 crc kubenswrapper[4870]: E0312 00:11:19.105000 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:20 crc kubenswrapper[4870]: E0312 00:11:20.226077 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.105178 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.105308 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.105336 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:21 crc kubenswrapper[4870]: E0312 00:11:21.106018 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:21 crc kubenswrapper[4870]: E0312 00:11:21.106100 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:21 crc kubenswrapper[4870]: E0312 00:11:21.106399 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.106587 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:21 crc kubenswrapper[4870]: E0312 00:11:21.106827 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.979304 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/1.log" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.979846 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/0.log" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.979883 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ad1e98a-cb66-436d-8e5e-301724f70769" containerID="c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae" exitCode=1 Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.979909 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerDied","Data":"c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae"} Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.979940 4870 scope.go:117] "RemoveContainer" containerID="1db9af46d329a96a9e864d1fbd042bf208a8f77300d157ea801d057374a740e2" Mar 12 00:11:21 crc kubenswrapper[4870]: I0312 00:11:21.980925 4870 scope.go:117] "RemoveContainer" containerID="c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae" Mar 12 00:11:21 crc kubenswrapper[4870]: E0312 00:11:21.981346 4870 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8hngl_openshift-multus(2ad1e98a-cb66-436d-8e5e-301724f70769)\"" pod="openshift-multus/multus-8hngl" podUID="2ad1e98a-cb66-436d-8e5e-301724f70769" Mar 12 00:11:22 crc kubenswrapper[4870]: I0312 00:11:22.006689 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fkczn" podStartSLOduration=117.006665805 podStartE2EDuration="1m57.006665805s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:13.971417226 +0000 UTC m=+164.574833556" watchObservedRunningTime="2026-03-12 00:11:22.006665805 +0000 UTC m=+172.610082135" Mar 12 00:11:22 crc kubenswrapper[4870]: I0312 00:11:22.985094 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/1.log" Mar 12 00:11:23 crc kubenswrapper[4870]: I0312 00:11:23.104436 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:23 crc kubenswrapper[4870]: I0312 00:11:23.104518 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:23 crc kubenswrapper[4870]: I0312 00:11:23.104530 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:23 crc kubenswrapper[4870]: E0312 00:11:23.104877 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:23 crc kubenswrapper[4870]: E0312 00:11:23.104678 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:23 crc kubenswrapper[4870]: E0312 00:11:23.104938 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:23 crc kubenswrapper[4870]: I0312 00:11:23.104538 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:23 crc kubenswrapper[4870]: E0312 00:11:23.105018 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:25 crc kubenswrapper[4870]: I0312 00:11:25.104506 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:25 crc kubenswrapper[4870]: I0312 00:11:25.104574 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:25 crc kubenswrapper[4870]: I0312 00:11:25.104506 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:25 crc kubenswrapper[4870]: I0312 00:11:25.104522 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:25 crc kubenswrapper[4870]: E0312 00:11:25.104624 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:25 crc kubenswrapper[4870]: E0312 00:11:25.104721 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:25 crc kubenswrapper[4870]: E0312 00:11:25.104807 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:25 crc kubenswrapper[4870]: E0312 00:11:25.105264 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:25 crc kubenswrapper[4870]: I0312 00:11:25.105958 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:11:25 crc kubenswrapper[4870]: E0312 00:11:25.227420 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 12 00:11:25 crc kubenswrapper[4870]: I0312 00:11:25.979922 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xkrk6"] Mar 12 00:11:25 crc kubenswrapper[4870]: I0312 00:11:25.997738 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/3.log" Mar 12 00:11:26 crc kubenswrapper[4870]: I0312 00:11:26.000755 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:26 crc kubenswrapper[4870]: I0312 00:11:26.000806 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerStarted","Data":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} Mar 12 00:11:26 crc kubenswrapper[4870]: E0312 00:11:26.000889 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:26 crc kubenswrapper[4870]: I0312 00:11:26.001402 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:11:26 crc kubenswrapper[4870]: I0312 00:11:26.032564 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podStartSLOduration=121.032548641 podStartE2EDuration="2m1.032548641s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:26.031884872 +0000 UTC m=+176.635301222" watchObservedRunningTime="2026-03-12 00:11:26.032548641 +0000 UTC m=+176.635964951" Mar 12 00:11:27 crc kubenswrapper[4870]: I0312 00:11:27.104714 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:27 crc kubenswrapper[4870]: I0312 00:11:27.104787 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:27 crc kubenswrapper[4870]: I0312 00:11:27.104745 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:27 crc kubenswrapper[4870]: I0312 00:11:27.104714 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:27 crc kubenswrapper[4870]: E0312 00:11:27.104903 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:27 crc kubenswrapper[4870]: E0312 00:11:27.104952 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:27 crc kubenswrapper[4870]: E0312 00:11:27.105030 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:27 crc kubenswrapper[4870]: E0312 00:11:27.105233 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:29 crc kubenswrapper[4870]: I0312 00:11:29.104365 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:29 crc kubenswrapper[4870]: I0312 00:11:29.104426 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:29 crc kubenswrapper[4870]: I0312 00:11:29.104453 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:29 crc kubenswrapper[4870]: E0312 00:11:29.104501 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:29 crc kubenswrapper[4870]: I0312 00:11:29.104381 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:29 crc kubenswrapper[4870]: E0312 00:11:29.104698 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:29 crc kubenswrapper[4870]: E0312 00:11:29.104725 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:29 crc kubenswrapper[4870]: E0312 00:11:29.104851 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:30 crc kubenswrapper[4870]: E0312 00:11:30.228666 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:11:31 crc kubenswrapper[4870]: I0312 00:11:31.104289 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:31 crc kubenswrapper[4870]: I0312 00:11:31.104380 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:31 crc kubenswrapper[4870]: E0312 00:11:31.104455 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:31 crc kubenswrapper[4870]: I0312 00:11:31.104302 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:31 crc kubenswrapper[4870]: E0312 00:11:31.104568 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:31 crc kubenswrapper[4870]: I0312 00:11:31.104332 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:31 crc kubenswrapper[4870]: E0312 00:11:31.104648 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:31 crc kubenswrapper[4870]: E0312 00:11:31.104685 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:33 crc kubenswrapper[4870]: I0312 00:11:33.104247 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:33 crc kubenswrapper[4870]: I0312 00:11:33.104319 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:33 crc kubenswrapper[4870]: I0312 00:11:33.104255 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:33 crc kubenswrapper[4870]: E0312 00:11:33.104381 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:33 crc kubenswrapper[4870]: I0312 00:11:33.104270 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:33 crc kubenswrapper[4870]: E0312 00:11:33.104504 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:33 crc kubenswrapper[4870]: E0312 00:11:33.104549 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:33 crc kubenswrapper[4870]: E0312 00:11:33.104636 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:35 crc kubenswrapper[4870]: I0312 00:11:35.104691 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:35 crc kubenswrapper[4870]: I0312 00:11:35.104811 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:35 crc kubenswrapper[4870]: E0312 00:11:35.104848 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:35 crc kubenswrapper[4870]: I0312 00:11:35.104958 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:35 crc kubenswrapper[4870]: I0312 00:11:35.105023 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:35 crc kubenswrapper[4870]: E0312 00:11:35.105103 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:35 crc kubenswrapper[4870]: E0312 00:11:35.105293 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:35 crc kubenswrapper[4870]: E0312 00:11:35.105626 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:35 crc kubenswrapper[4870]: I0312 00:11:35.106464 4870 scope.go:117] "RemoveContainer" containerID="c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae" Mar 12 00:11:35 crc kubenswrapper[4870]: E0312 00:11:35.230095 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 00:11:36 crc kubenswrapper[4870]: I0312 00:11:36.035343 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/1.log" Mar 12 00:11:36 crc kubenswrapper[4870]: I0312 00:11:36.035420 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerStarted","Data":"52b5a384822516956958b8eb4a6f0f514a9febbe684f59ae926459ef7203c441"} Mar 12 00:11:37 crc kubenswrapper[4870]: I0312 00:11:37.104172 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:37 crc kubenswrapper[4870]: I0312 00:11:37.104209 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:37 crc kubenswrapper[4870]: I0312 00:11:37.104285 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:37 crc kubenswrapper[4870]: E0312 00:11:37.104336 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:37 crc kubenswrapper[4870]: I0312 00:11:37.104411 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:37 crc kubenswrapper[4870]: E0312 00:11:37.104599 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:37 crc kubenswrapper[4870]: E0312 00:11:37.104974 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:37 crc kubenswrapper[4870]: E0312 00:11:37.105325 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:39 crc kubenswrapper[4870]: I0312 00:11:39.104552 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:39 crc kubenswrapper[4870]: E0312 00:11:39.104782 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 00:11:39 crc kubenswrapper[4870]: I0312 00:11:39.104838 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:39 crc kubenswrapper[4870]: I0312 00:11:39.104911 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:39 crc kubenswrapper[4870]: I0312 00:11:39.104839 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:39 crc kubenswrapper[4870]: E0312 00:11:39.105046 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 00:11:39 crc kubenswrapper[4870]: E0312 00:11:39.105224 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkrk6" podUID="5c62c8d9-0f6b-4ec4-af08-fae75fb41288" Mar 12 00:11:39 crc kubenswrapper[4870]: E0312 00:11:39.105328 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.104673 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.104724 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.104722 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.104672 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.109201 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.109307 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.109574 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.109627 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.109812 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 00:11:41 crc kubenswrapper[4870]: I0312 00:11:41.110241 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.155359 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.197300 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kwb4k"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 
00:11:44.198034 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.198389 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lspxp"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.199282 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.203053 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.203413 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kvh84"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.203819 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.204118 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.208495 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.208562 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.208680 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.208614 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.208886 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.209186 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.209912 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.210491 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.210632 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.210816 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.211124 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.211505 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.211607 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.211680 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.211126 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29554560-f9x27"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.211751 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.211682 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.212148 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.213680 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.213859 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.214638 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.214651 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.215564 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.215782 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7gfg9"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.216243 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.245821 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-h8r8v"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.251818 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.252113 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.270469 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.271353 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.271948 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.272480 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.272763 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-h8r8v"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.275678 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.275814 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.275890 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.277281 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8jsrj"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.277681 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.277943 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.278231 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.278434 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.278672 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.278807 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.283969 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.284388 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.284431 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.284554 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.284676 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.284817 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.284858 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vbgrg"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.284926 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285025 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285128 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285242 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285347 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285476 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285583 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285593 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.285480 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vbgrg"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.286038 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.286811 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.287299 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.288340 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.288515 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.288639 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.288738 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.288906 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.288922 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289021 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289047 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289072 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289198 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289235 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289622 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289712 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289858 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.289202 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.291831 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.292369 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dcrsg"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.292662 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.292813 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.292967 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.293298 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.295238 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.297770 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.297960 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298554 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298611 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298636 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-etcd-client\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298659 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78dhb\" (UniqueName: \"kubernetes.io/projected/368b0e75-e87a-43f9-9369-588871bf28be-kube-api-access-78dhb\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298678 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-serving-cert\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298695 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e80a24bf-734e-476e-9559-4b1bc913802a-audit-dir\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298711 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298727 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d6a8bb4-df10-46c3-91e6-826e501be09f-serviceca\") pod \"image-pruner-29554560-f9x27\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " pod="openshift-image-registry/image-pruner-29554560-f9x27"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298741 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-client-ca\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298755 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298769 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06725c1-5841-4d3b-ae47-c78a608229e0-config\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298784 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298800 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298817 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298846 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlz89\" (UniqueName: \"kubernetes.io/projected/dae6a345-cb5d-4553-868f-232fc4ec81af-kube-api-access-xlz89\") pod \"downloads-7954f5f757-h8r8v\" (UID: \"dae6a345-cb5d-4553-868f-232fc4ec81af\") " pod="openshift-console/downloads-7954f5f757-h8r8v"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298862 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-client-ca\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298879 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a67d7087-6ab3-42ff-b5cc-cba186b5b036-audit-dir\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298894 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a06725c1-5841-4d3b-ae47-c78a608229e0-machine-approver-tls\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298896 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298987 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.298910 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e80a24bf-734e-476e-9559-4b1bc913802a-node-pullsecrets\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299039 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-audit\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299055 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299080 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-config\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299100 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-config\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299121 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299137 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-audit-policies\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299158 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-etcd-serving-ca\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299187 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kcw9\" (UniqueName: \"kubernetes.io/projected/e80a24bf-734e-476e-9559-4b1bc913802a-kube-api-access-8kcw9\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299203 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/368b0e75-e87a-43f9-9369-588871bf28be-audit-dir\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299217 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-serving-cert\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299231 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rmcv\" (UniqueName: \"kubernetes.io/projected/a67d7087-6ab3-42ff-b5cc-cba186b5b036-kube-api-access-6rmcv\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299248 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299262 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-config\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299284 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ae78cd-522f-4125-b2d9-84c52dbeadcb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299298 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299312 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299333 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a38e148-742e-4f33-a30c-7289fad54acb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wzkc5\" (UID: \"6a38e148-742e-4f33-a30c-7289fad54acb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299345 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299349 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxc2q\" (UniqueName: \"kubernetes.io/projected/6a38e148-742e-4f33-a30c-7289fad54acb-kube-api-access-nxc2q\") pod \"cluster-samples-operator-665b6dd947-wzkc5\" (UID: \"6a38e148-742e-4f33-a30c-7289fad54acb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299507 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299538 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-audit-policies\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299561 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299584 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ae78cd-522f-4125-b2d9-84c52dbeadcb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299602 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8041594-4bbd-408a-b59d-26bb0e17a95e-serving-cert\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299622 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62kkf\" (UniqueName: \"kubernetes.io/projected/6d6a8bb4-df10-46c3-91e6-826e501be09f-kube-api-access-62kkf\") pod \"image-pruner-29554560-f9x27\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " pod="openshift-image-registry/image-pruner-29554560-f9x27"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299648 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b68206-2dd1-410e-930d-a97b21caddc9-serving-cert\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299666 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-config\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299688 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh92z\" (UniqueName: \"kubernetes.io/projected/d8041594-4bbd-408a-b59d-26bb0e17a95e-kube-api-access-sh92z\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299710 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-images\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299732 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a06725c1-5841-4d3b-ae47-c78a608229e0-auth-proxy-config\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299759 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-etcd-client\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299781 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-image-import-ca\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299804 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-encryption-config\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299823 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v2c2\" (UniqueName: \"kubernetes.io/projected/a06725c1-5841-4d3b-ae47-c78a608229e0-kube-api-access-2v2c2\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299844 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299978 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6kl\" (UniqueName: \"kubernetes.io/projected/c3b68206-2dd1-410e-930d-a97b21caddc9-kube-api-access-vc6kl\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.300003 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.300055 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/14ae78cd-522f-4125-b2d9-84c52dbeadcb-kube-api-access-jf962\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.300076 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zmhz\" (UniqueName: \"kubernetes.io/projected/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-kube-api-access-2zmhz\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.300095 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-encryption-config\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299419 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299463 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.299514 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.300958 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.301059 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5l64f"]
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.302806 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.303417 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.303616 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.303926 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.319836 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.321400 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.322163 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.323963 4870 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-console"/"console-oauth-config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.324509 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.324600 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.324776 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.325019 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.325725 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5l4xj"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.329300 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.329505 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.329603 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.329740 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.329747 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.348373 4870 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.348575 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.350647 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.350779 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.350884 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.351285 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.351366 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.351406 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.351452 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.351544 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.351645 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.351963 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.352053 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.352654 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.352681 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29554560-f9x27"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.352691 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.352753 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.353098 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.355109 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.355616 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-647f6"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.355955 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.356138 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.356651 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c88kv"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.357225 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.358057 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.359547 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.360333 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.360448 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.360743 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.360891 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.361503 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.362071 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.362439 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.365693 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.366875 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.367997 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.368001 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.369611 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.371460 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.372283 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.372459 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.372941 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.374647 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.375277 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bbcc4"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.375432 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.375653 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.376719 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.381240 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.381422 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6znk2"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.381668 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.382389 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.384603 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gptpv"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.385812 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.386697 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.386756 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.387056 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554570-l4btp"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.391082 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-clrh6"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.391237 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554570-l4btp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.392441 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.392657 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.393329 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.394616 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.395497 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.396057 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.398040 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.398472 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.398945 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.400750 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.401541 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.401955 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.403284 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.403656 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0498d476-ade9-410a-93e3-5f344bccd8ba-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.403701 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggv9\" (UniqueName: \"kubernetes.io/projected/2554ffbb-61ab-48bc-bc78-05ae8517f40d-kube-api-access-lggv9\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.403749 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh92z\" (UniqueName: \"kubernetes.io/projected/d8041594-4bbd-408a-b59d-26bb0e17a95e-kube-api-access-sh92z\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: 
\"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.403787 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-images\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.403946 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a06725c1-5841-4d3b-ae47-c78a608229e0-auth-proxy-config\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.404115 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdpxh\" (UniqueName: \"kubernetes.io/projected/a0cb447f-792e-4438-b3f2-32bd2f408f03-kube-api-access-bdpxh\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.404179 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-signing-key\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.404216 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-etcd-client\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.404924 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0cb447f-792e-4438-b3f2-32bd2f408f03-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.404981 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-ca\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.405018 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.405064 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/93b8bee4-6475-4c86-abeb-92e555f4e1eb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w8l5k\" 
(UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.405097 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0498d476-ade9-410a-93e3-5f344bccd8ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.405133 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-image-import-ca\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.405145 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-images\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.405022 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a06725c1-5841-4d3b-ae47-c78a608229e0-auth-proxy-config\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.405932 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-kwb4k"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.407334 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-image-import-ca\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.408279 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.409861 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-encryption-config\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.409925 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v2c2\" (UniqueName: \"kubernetes.io/projected/a06725c1-5841-4d3b-ae47-c78a608229e0-kube-api-access-2v2c2\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.409959 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6kl\" (UniqueName: \"kubernetes.io/projected/c3b68206-2dd1-410e-930d-a97b21caddc9-kube-api-access-vc6kl\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.409989 
4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.409986 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.410023 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2554ffbb-61ab-48bc-bc78-05ae8517f40d-trusted-ca\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.410051 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0465219-4339-46be-90ab-0e4519f19493-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.410095 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/14ae78cd-522f-4125-b2d9-84c52dbeadcb-kube-api-access-jf962\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 
00:11:44.410562 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zmhz\" (UniqueName: \"kubernetes.io/projected/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-kube-api-access-2zmhz\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.410681 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-encryption-config\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.410815 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-etcd-client\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.410863 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.410892 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.411903 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lspxp"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.412681 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd66140c-ae6e-461a-84a9-597fdb115dd8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.412816 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd66140c-ae6e-461a-84a9-597fdb115dd8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413129 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-oauth-serving-cert\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413214 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78dhb\" (UniqueName: \"kubernetes.io/projected/368b0e75-e87a-43f9-9369-588871bf28be-kube-api-access-78dhb\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413276 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93b8bee4-6475-4c86-abeb-92e555f4e1eb-serving-cert\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413307 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bfd7abe7-1271-40a0-b011-eb4841fb3c03-images\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413371 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5qv\" (UniqueName: \"kubernetes.io/projected/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-kube-api-access-sd5qv\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413410 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-serving-cert\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413439 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e80a24bf-734e-476e-9559-4b1bc913802a-audit-dir\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413467 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413498 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-signing-cabundle\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413530 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d6a8bb4-df10-46c3-91e6-826e501be09f-serviceca\") pod \"image-pruner-29554560-f9x27\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413563 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413593 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413623 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0cb447f-792e-4438-b3f2-32bd2f408f03-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413648 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-client-ca\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413674 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413702 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06725c1-5841-4d3b-ae47-c78a608229e0-config\") pod \"machine-approver-56656f9798-2dpsl\" (UID: 
\"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413731 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413760 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4g7j\" (UniqueName: \"kubernetes.io/projected/464113cb-0982-46cb-91e1-95fc6e7a9f83-kube-api-access-r4g7j\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413793 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0465219-4339-46be-90ab-0e4519f19493-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413836 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-trusted-ca-bundle\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413863 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlz89\" (UniqueName: \"kubernetes.io/projected/dae6a345-cb5d-4553-868f-232fc4ec81af-kube-api-access-xlz89\") pod \"downloads-7954f5f757-h8r8v\" (UID: \"dae6a345-cb5d-4553-868f-232fc4ec81af\") " pod="openshift-console/downloads-7954f5f757-h8r8v" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413891 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a06725c1-5841-4d3b-ae47-c78a608229e0-machine-approver-tls\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413918 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/464113cb-0982-46cb-91e1-95fc6e7a9f83-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413920 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k5ks6"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413943 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-config\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413972 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-client-ca\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.413998 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a67d7087-6ab3-42ff-b5cc-cba186b5b036-audit-dir\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414026 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e80a24bf-734e-476e-9559-4b1bc913802a-node-pullsecrets\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414053 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-audit\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414088 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414116 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2554ffbb-61ab-48bc-bc78-05ae8517f40d-serving-cert\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414151 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-config\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414197 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-service-ca\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414222 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0465219-4339-46be-90ab-0e4519f19493-config\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414272 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") 
" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414299 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-audit-policies\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414326 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qkf\" (UniqueName: \"kubernetes.io/projected/ee267da0-c1eb-4a5d-80d5-da65c77a7c23-kube-api-access-x9qkf\") pod \"migrator-59844c95c7-7ccgj\" (UID: \"ee267da0-c1eb-4a5d-80d5-da65c77a7c23\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414354 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-config\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414383 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-config\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414414 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7abe7-1271-40a0-b011-eb4841fb3c03-proxy-tls\") pod 
\"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.414975 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.415436 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-encryption-config\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.415839 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.416187 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d6a8bb4-df10-46c3-91e6-826e501be09f-serviceca\") pod \"image-pruner-29554560-f9x27\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.416674 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7gfg9"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.416828 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a67d7087-6ab3-42ff-b5cc-cba186b5b036-audit-dir\") 
pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.416870 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e80a24bf-734e-476e-9559-4b1bc913802a-node-pullsecrets\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.412688 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.417532 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-audit\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.417548 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-client-ca\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.417560 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8kqp\" (UniqueName: 
\"kubernetes.io/projected/bfd7abe7-1271-40a0-b011-eb4841fb3c03-kube-api-access-d8kqp\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418233 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/464113cb-0982-46cb-91e1-95fc6e7a9f83-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418274 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxbxg\" (UniqueName: \"kubernetes.io/projected/0498d476-ade9-410a-93e3-5f344bccd8ba-kube-api-access-xxbxg\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418295 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06725c1-5841-4d3b-ae47-c78a608229e0-config\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418310 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-etcd-serving-ca\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418347 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kcw9\" (UniqueName: \"kubernetes.io/projected/e80a24bf-734e-476e-9559-4b1bc913802a-kube-api-access-8kcw9\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418365 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-config\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418385 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-serving-cert\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.417702 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418419 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/368b0e75-e87a-43f9-9369-588871bf28be-audit-dir\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: 
\"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418454 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/368b0e75-e87a-43f9-9369-588871bf28be-audit-dir\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418460 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2554ffbb-61ab-48bc-bc78-05ae8517f40d-config\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418237 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418502 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rmcv\" (UniqueName: \"kubernetes.io/projected/a67d7087-6ab3-42ff-b5cc-cba186b5b036-kube-api-access-6rmcv\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418525 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-service-ca\") pod 
\"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418543 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418563 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-config\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418601 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ae78cd-522f-4125-b2d9-84c52dbeadcb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418625 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418645 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418667 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a38e148-742e-4f33-a30c-7289fad54acb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wzkc5\" (UID: \"6a38e148-742e-4f33-a30c-7289fad54acb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418691 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s48n\" (UniqueName: \"kubernetes.io/projected/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-kube-api-access-6s48n\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418714 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxc2q\" (UniqueName: \"kubernetes.io/projected/6a38e148-742e-4f33-a30c-7289fad54acb-kube-api-access-nxc2q\") pod \"cluster-samples-operator-665b6dd947-wzkc5\" (UID: \"6a38e148-742e-4f33-a30c-7289fad54acb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418736 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418757 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-serving-cert\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418779 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd66140c-ae6e-461a-84a9-597fdb115dd8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418799 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418821 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-audit-policies\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418827 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-client-ca\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418845 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-client\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418897 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ae78cd-522f-4125-b2d9-84c52dbeadcb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418919 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8041594-4bbd-408a-b59d-26bb0e17a95e-serving-cert\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418935 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62kkf\" (UniqueName: \"kubernetes.io/projected/6d6a8bb4-df10-46c3-91e6-826e501be09f-kube-api-access-62kkf\") pod \"image-pruner-29554560-f9x27\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 
00:11:44.418947 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e80a24bf-734e-476e-9559-4b1bc913802a-audit-dir\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418957 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfd7abe7-1271-40a0-b011-eb4841fb3c03-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.418981 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/464113cb-0982-46cb-91e1-95fc6e7a9f83-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419001 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-oauth-config\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419024 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-config\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: 
\"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419043 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b68206-2dd1-410e-930d-a97b21caddc9-serving-cert\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419066 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zr9\" (UniqueName: \"kubernetes.io/projected/93b8bee4-6475-4c86-abeb-92e555f4e1eb-kube-api-access-g8zr9\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419086 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-serving-cert\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419085 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-config\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419104 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5hrl\" 
(UniqueName: \"kubernetes.io/projected/8d26541a-27be-4bb8-99f2-43f63e4729a2-kube-api-access-z5hrl\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.417755 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.419178 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-etcd-serving-ca\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.420169 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-audit-policies\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.420556 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.420763 4870 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80a24bf-734e-476e-9559-4b1bc913802a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.421143 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.421286 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-config\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.421325 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dcrsg"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.421713 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-audit-policies\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.421745 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67d7087-6ab3-42ff-b5cc-cba186b5b036-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.421782 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/14ae78cd-522f-4125-b2d9-84c52dbeadcb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.422663 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-encryption-config\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.422692 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-config\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.422965 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-etcd-client\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.423271 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.423315 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.423686 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a06725c1-5841-4d3b-ae47-c78a608229e0-machine-approver-tls\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.423805 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-etcd-client\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.424766 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.424784 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80a24bf-734e-476e-9559-4b1bc913802a-serving-cert\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:44 crc 
kubenswrapper[4870]: I0312 00:11:44.425938 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b68206-2dd1-410e-930d-a97b21caddc9-serving-cert\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.425988 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67d7087-6ab3-42ff-b5cc-cba186b5b036-serving-cert\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.426398 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8041594-4bbd-408a-b59d-26bb0e17a95e-serving-cert\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.426410 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.427383 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.429516 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vbgrg"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.438701 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.441415 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.442816 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a38e148-742e-4f33-a30c-7289fad54acb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wzkc5\" (UID: \"6a38e148-742e-4f33-a30c-7289fad54acb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.442980 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ae78cd-522f-4125-b2d9-84c52dbeadcb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 
00:11:44.443137 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.443322 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8jsrj"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.443716 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.445725 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kvh84"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.446954 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6znk2"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.448042 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5l4xj"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.449329 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.450455 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.451418 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.452418 4870 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5l64f"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.453413 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-h8r8v"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.455422 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.456489 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.457444 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.458572 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.459594 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bbcc4"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.460084 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.460671 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.462238 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.463205 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.464695 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.466187 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z5bcc"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.467050 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.467233 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nkq8z"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.468444 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.468771 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.470242 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.471235 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.472510 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554570-l4btp"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.473812 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-clrh6"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.475068 4870 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.476213 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z5bcc"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.477588 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gptpv"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.478399 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nkq8z"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.479515 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c88kv"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.479859 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.480944 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k5ks6"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.481704 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.482707 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7vhpt"] Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.483431 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.500215 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519776 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0cb447f-792e-4438-b3f2-32bd2f408f03-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519832 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4g7j\" (UniqueName: \"kubernetes.io/projected/464113cb-0982-46cb-91e1-95fc6e7a9f83-kube-api-access-r4g7j\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519856 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0465219-4339-46be-90ab-0e4519f19493-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519881 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-trusted-ca-bundle\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " 
pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519927 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/464113cb-0982-46cb-91e1-95fc6e7a9f83-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519945 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-config\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519962 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-config\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.519998 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2554ffbb-61ab-48bc-bc78-05ae8517f40d-serving-cert\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520015 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0465219-4339-46be-90ab-0e4519f19493-config\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: 
\"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520038 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-service-ca\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520056 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qkf\" (UniqueName: \"kubernetes.io/projected/ee267da0-c1eb-4a5d-80d5-da65c77a7c23-kube-api-access-x9qkf\") pod \"migrator-59844c95c7-7ccgj\" (UID: \"ee267da0-c1eb-4a5d-80d5-da65c77a7c23\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520092 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7abe7-1271-40a0-b011-eb4841fb3c03-proxy-tls\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520108 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8kqp\" (UniqueName: \"kubernetes.io/projected/bfd7abe7-1271-40a0-b011-eb4841fb3c03-kube-api-access-d8kqp\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520124 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/464113cb-0982-46cb-91e1-95fc6e7a9f83-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520140 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxbxg\" (UniqueName: \"kubernetes.io/projected/0498d476-ade9-410a-93e3-5f344bccd8ba-kube-api-access-xxbxg\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520191 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2554ffbb-61ab-48bc-bc78-05ae8517f40d-config\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520212 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-service-ca\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520261 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s48n\" (UniqueName: \"kubernetes.io/projected/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-kube-api-access-6s48n\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:44 crc 
kubenswrapper[4870]: I0312 00:11:44.520288 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-serving-cert\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520305 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd66140c-ae6e-461a-84a9-597fdb115dd8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520348 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfd7abe7-1271-40a0-b011-eb4841fb3c03-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520364 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-client\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520379 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/464113cb-0982-46cb-91e1-95fc6e7a9f83-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520393 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-oauth-config\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520432 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zr9\" (UniqueName: \"kubernetes.io/projected/93b8bee4-6475-4c86-abeb-92e555f4e1eb-kube-api-access-g8zr9\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520448 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-serving-cert\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520463 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5hrl\" (UniqueName: \"kubernetes.io/projected/8d26541a-27be-4bb8-99f2-43f63e4729a2-kube-api-access-z5hrl\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520479 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggv9\" 
(UniqueName: \"kubernetes.io/projected/2554ffbb-61ab-48bc-bc78-05ae8517f40d-kube-api-access-lggv9\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520522 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdpxh\" (UniqueName: \"kubernetes.io/projected/a0cb447f-792e-4438-b3f2-32bd2f408f03-kube-api-access-bdpxh\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520538 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0498d476-ade9-410a-93e3-5f344bccd8ba-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520553 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-signing-key\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520592 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0cb447f-792e-4438-b3f2-32bd2f408f03-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520610 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-ca\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520624 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/93b8bee4-6475-4c86-abeb-92e555f4e1eb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520641 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0498d476-ade9-410a-93e3-5f344bccd8ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520700 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2554ffbb-61ab-48bc-bc78-05ae8517f40d-trusted-ca\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520715 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c0465219-4339-46be-90ab-0e4519f19493-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520775 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd66140c-ae6e-461a-84a9-597fdb115dd8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520790 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-oauth-serving-cert\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520812 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd66140c-ae6e-461a-84a9-597fdb115dd8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520847 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bfd7abe7-1271-40a0-b011-eb4841fb3c03-images\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520865 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5qv\" (UniqueName: \"kubernetes.io/projected/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-kube-api-access-sd5qv\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520871 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-service-ca\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520886 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-config\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520880 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93b8bee4-6475-4c86-abeb-92e555f4e1eb-serving-cert\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.520946 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-signing-cabundle\") pod \"service-ca-9c57cc56f-bbcc4\" 
(UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.521438 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-trusted-ca-bundle\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.521549 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/93b8bee4-6475-4c86-abeb-92e555f4e1eb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.522233 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/464113cb-0982-46cb-91e1-95fc6e7a9f83-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.522321 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-ca\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.522579 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0498d476-ade9-410a-93e3-5f344bccd8ba-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.522878 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-service-ca\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.523769 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8d26541a-27be-4bb8-99f2-43f63e4729a2-oauth-serving-cert\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.524033 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93b8bee4-6475-4c86-abeb-92e555f4e1eb-serving-cert\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.524407 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-config\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.525331 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/bfd7abe7-1271-40a0-b011-eb4841fb3c03-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.527003 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/464113cb-0982-46cb-91e1-95fc6e7a9f83-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.528329 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-oauth-config\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.528754 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-etcd-client\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.529359 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0498d476-ade9-410a-93e3-5f344bccd8ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 
00:11:44.536741 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d26541a-27be-4bb8-99f2-43f63e4729a2-console-serving-cert\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.537495 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-serving-cert\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.541110 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.560221 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.591504 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.600972 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.605232 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2554ffbb-61ab-48bc-bc78-05ae8517f40d-serving-cert\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.637092 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.643059 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2554ffbb-61ab-48bc-bc78-05ae8517f40d-config\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.650047 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.654643 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2554ffbb-61ab-48bc-bc78-05ae8517f40d-trusted-ca\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.661687 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.680070 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.700879 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.720965 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.741633 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.760471 4870 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.781262 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.801188 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.821046 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.841230 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.861334 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.880808 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.901089 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.929478 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.941242 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.960654 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.980102 4870 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 00:11:44 crc kubenswrapper[4870]: I0312 00:11:44.999855 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.004908 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0cb447f-792e-4438-b3f2-32bd2f408f03-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.020340 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.021693 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0cb447f-792e-4438-b3f2-32bd2f408f03-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.042130 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.061626 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.080650 4870 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.082600 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0465219-4339-46be-90ab-0e4519f19493-config\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.102103 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.122649 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.137859 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0465219-4339-46be-90ab-0e4519f19493-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.141601 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.162106 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.182648 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.201506 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.205256 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bfd7abe7-1271-40a0-b011-eb4841fb3c03-images\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.221272 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.237583 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfd7abe7-1271-40a0-b011-eb4841fb3c03-proxy-tls\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.242588 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.260751 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.281810 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.300868 4870 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.320433 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.340710 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.380828 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.399397 4870 request.go:700] Waited for 1.019296301s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dsigning-key&limit=500&resourceVersion=0 Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.402694 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.416521 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-signing-key\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.422490 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.441663 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.452711 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-signing-cabundle\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.460977 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.481582 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.501955 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.520998 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 00:11:45 crc kubenswrapper[4870]: E0312 00:11:45.521928 4870 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 12 00:11:45 crc kubenswrapper[4870]: E0312 00:11:45.522020 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd66140c-ae6e-461a-84a9-597fdb115dd8-serving-cert podName:fd66140c-ae6e-461a-84a9-597fdb115dd8 nodeName:}" failed. No retries permitted until 2026-03-12 00:11:46.021992241 +0000 UTC m=+196.625408581 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fd66140c-ae6e-461a-84a9-597fdb115dd8-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" (UID: "fd66140c-ae6e-461a-84a9-597fdb115dd8") : failed to sync secret cache: timed out waiting for the condition Mar 12 00:11:45 crc kubenswrapper[4870]: E0312 00:11:45.522071 4870 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 12 00:11:45 crc kubenswrapper[4870]: E0312 00:11:45.522208 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fd66140c-ae6e-461a-84a9-597fdb115dd8-config podName:fd66140c-ae6e-461a-84a9-597fdb115dd8 nodeName:}" failed. No retries permitted until 2026-03-12 00:11:46.022178976 +0000 UTC m=+196.625595326 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/fd66140c-ae6e-461a-84a9-597fdb115dd8-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" (UID: "fd66140c-ae6e-461a-84a9-597fdb115dd8") : failed to sync configmap cache: timed out waiting for the condition Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.540582 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.561655 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.582259 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.602361 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.622504 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.640798 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.661477 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.689525 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.701801 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.720910 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.741453 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.760584 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.780456 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.801529 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 
00:11:45.820680 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.840396 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.860540 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.881442 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.901864 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.921329 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.940784 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.961127 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 00:11:45 crc kubenswrapper[4870]: I0312 00:11:45.981765 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.001873 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.021252 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.041089 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.047869 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd66140c-ae6e-461a-84a9-597fdb115dd8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.048189 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd66140c-ae6e-461a-84a9-597fdb115dd8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.049395 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd66140c-ae6e-461a-84a9-597fdb115dd8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.053904 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd66140c-ae6e-461a-84a9-597fdb115dd8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.068950 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.081969 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.132239 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh92z\" (UniqueName: \"kubernetes.io/projected/d8041594-4bbd-408a-b59d-26bb0e17a95e-kube-api-access-sh92z\") pod \"route-controller-manager-6576b87f9c-78r6p\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.150316 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/14ae78cd-522f-4125-b2d9-84c52dbeadcb-kube-api-access-jf962\") pod \"openshift-apiserver-operator-796bbdcf4f-cz27n\" (UID: \"14ae78cd-522f-4125-b2d9-84c52dbeadcb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.159152 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v2c2\" (UniqueName: \"kubernetes.io/projected/a06725c1-5841-4d3b-ae47-c78a608229e0-kube-api-access-2v2c2\") pod \"machine-approver-56656f9798-2dpsl\" (UID: \"a06725c1-5841-4d3b-ae47-c78a608229e0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.185815 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6kl\" 
(UniqueName: \"kubernetes.io/projected/c3b68206-2dd1-410e-930d-a97b21caddc9-kube-api-access-vc6kl\") pod \"controller-manager-879f6c89f-kwb4k\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.190289 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.198305 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.200652 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.202775 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zmhz\" (UniqueName: \"kubernetes.io/projected/3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a-kube-api-access-2zmhz\") pod \"machine-api-operator-5694c8668f-kvh84\" (UID: \"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.221699 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.241228 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.261957 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.276192 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.316848 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78dhb\" (UniqueName: \"kubernetes.io/projected/368b0e75-e87a-43f9-9369-588871bf28be-kube-api-access-78dhb\") pod \"oauth-openshift-558db77b4-7gfg9\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.320595 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlz89\" (UniqueName: \"kubernetes.io/projected/dae6a345-cb5d-4553-868f-232fc4ec81af-kube-api-access-xlz89\") pod \"downloads-7954f5f757-h8r8v\" (UID: \"dae6a345-cb5d-4553-868f-232fc4ec81af\") " pod="openshift-console/downloads-7954f5f757-h8r8v" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.344117 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kcw9\" (UniqueName: \"kubernetes.io/projected/e80a24bf-734e-476e-9559-4b1bc913802a-kube-api-access-8kcw9\") pod \"apiserver-76f77b778f-lspxp\" (UID: \"e80a24bf-734e-476e-9559-4b1bc913802a\") " pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.344507 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.357783 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rmcv\" (UniqueName: \"kubernetes.io/projected/a67d7087-6ab3-42ff-b5cc-cba186b5b036-kube-api-access-6rmcv\") pod \"apiserver-7bbb656c7d-ht5gd\" (UID: \"a67d7087-6ab3-42ff-b5cc-cba186b5b036\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.378596 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.380134 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62kkf\" (UniqueName: \"kubernetes.io/projected/6d6a8bb4-df10-46c3-91e6-826e501be09f-kube-api-access-62kkf\") pod \"image-pruner-29554560-f9x27\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.400808 4870 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.412476 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.422259 4870 request.go:700] Waited for 1.954994831s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.423965 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.436606 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxc2q\" (UniqueName: \"kubernetes.io/projected/6a38e148-742e-4f33-a30c-7289fad54acb-kube-api-access-nxc2q\") pod \"cluster-samples-operator-665b6dd947-wzkc5\" (UID: \"6a38e148-742e-4f33-a30c-7289fad54acb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.441774 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.465001 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.467397 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.480314 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.480348 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.500569 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.519497 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.520585 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.535440 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"] Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.536960 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n"] Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.546724 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.555294 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.561979 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-h8r8v" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.562937 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 00:11:46 crc kubenswrapper[4870]: W0312 00:11:46.570995 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14ae78cd_522f_4125_b2d9_84c52dbeadcb.slice/crio-8f102792ec4f68ada71b5aa10b40c210673e2b401233c38b1bd93b8031060be7 WatchSource:0}: Error finding container 8f102792ec4f68ada71b5aa10b40c210673e2b401233c38b1bd93b8031060be7: Status 404 returned error can't find the container with id 8f102792ec4f68ada71b5aa10b40c210673e2b401233c38b1bd93b8031060be7 Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.594348 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kwb4k"] Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.596135 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4g7j\" (UniqueName: \"kubernetes.io/projected/464113cb-0982-46cb-91e1-95fc6e7a9f83-kube-api-access-r4g7j\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.622561 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/464113cb-0982-46cb-91e1-95fc6e7a9f83-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-b2d8d\" (UID: \"464113cb-0982-46cb-91e1-95fc6e7a9f83\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.640866 4870 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6s48n\" (UniqueName: \"kubernetes.io/projected/ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12-kube-api-access-6s48n\") pod \"service-ca-9c57cc56f-bbcc4\" (UID: \"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.664050 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qkf\" (UniqueName: \"kubernetes.io/projected/ee267da0-c1eb-4a5d-80d5-da65c77a7c23-kube-api-access-x9qkf\") pod \"migrator-59844c95c7-7ccgj\" (UID: \"ee267da0-c1eb-4a5d-80d5-da65c77a7c23\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.678947 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8kqp\" (UniqueName: \"kubernetes.io/projected/bfd7abe7-1271-40a0-b011-eb4841fb3c03-kube-api-access-d8kqp\") pod \"machine-config-operator-74547568cd-5frd2\" (UID: \"bfd7abe7-1271-40a0-b011-eb4841fb3c03\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.687495 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.694310 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggv9\" (UniqueName: \"kubernetes.io/projected/2554ffbb-61ab-48bc-bc78-05ae8517f40d-kube-api-access-lggv9\") pod \"console-operator-58897d9998-5l4xj\" (UID: \"2554ffbb-61ab-48bc-bc78-05ae8517f40d\") " pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.700598 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.732038 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5hrl\" (UniqueName: \"kubernetes.io/projected/8d26541a-27be-4bb8-99f2-43f63e4729a2-kube-api-access-z5hrl\") pod \"console-f9d7485db-vbgrg\" (UID: \"8d26541a-27be-4bb8-99f2-43f63e4729a2\") " pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.736882 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxbxg\" (UniqueName: \"kubernetes.io/projected/0498d476-ade9-410a-93e3-5f344bccd8ba-kube-api-access-xxbxg\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6qxr\" (UID: \"0498d476-ade9-410a-93e3-5f344bccd8ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.776423 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.776691 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7gfg9"] Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.785775 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdpxh\" (UniqueName: \"kubernetes.io/projected/a0cb447f-792e-4438-b3f2-32bd2f408f03-kube-api-access-bdpxh\") pod \"kube-storage-version-migrator-operator-b67b599dd-ghs7z\" (UID: \"a0cb447f-792e-4438-b3f2-32bd2f408f03\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.807995 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0465219-4339-46be-90ab-0e4519f19493-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k84g8\" (UID: \"c0465219-4339-46be-90ab-0e4519f19493\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.812593 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zr9\" (UniqueName: \"kubernetes.io/projected/93b8bee4-6475-4c86-abeb-92e555f4e1eb-kube-api-access-g8zr9\") pod \"openshift-config-operator-7777fb866f-w8l5k\" (UID: \"93b8bee4-6475-4c86-abeb-92e555f4e1eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.816213 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd66140c-ae6e-461a-84a9-597fdb115dd8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-j5lkw\" (UID: \"fd66140c-ae6e-461a-84a9-597fdb115dd8\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.834903 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5qv\" (UniqueName: \"kubernetes.io/projected/8dc311ee-fc53-4e2b-8b1d-d512f36208cb-kube-api-access-sd5qv\") pod \"etcd-operator-b45778765-5l64f\" (UID: \"8dc311ee-fc53-4e2b-8b1d-d512f36208cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:46 crc kubenswrapper[4870]: W0312 00:11:46.853055 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod368b0e75_e87a_43f9_9369_588871bf28be.slice/crio-8babf072d8163389193d6157059eab62d4a37451c3217f76c6e5c63bb846ebc9 WatchSource:0}: Error finding container 8babf072d8163389193d6157059eab62d4a37451c3217f76c6e5c63bb846ebc9: Status 404 returned error can't find the container with id 8babf072d8163389193d6157059eab62d4a37451c3217f76c6e5c63bb846ebc9 Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.888243 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.889321 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.890762 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svfw9\" (UniqueName: \"kubernetes.io/projected/581f86c7-aa5f-4071-bba8-fd537cb93402-kube-api-access-svfw9\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.890830 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.891135 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.891233 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.891933 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/581f86c7-aa5f-4071-bba8-fd537cb93402-profile-collector-cert\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.892170 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/581f86c7-aa5f-4071-bba8-fd537cb93402-srv-cert\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.892264 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-trusted-ca\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.892284 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqs5\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-kube-api-access-prqs5\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.892338 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-tls\") pod \"image-registry-697d97f7c8-c88kv\" (UID: 
\"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.892374 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-certificates\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.892754 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-bound-sa-token\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: E0312 00:11:46.894348 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.394331198 +0000 UTC m=+197.997747578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.896385 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.901863 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.907991 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd"] Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.908230 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.926230 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kvh84"] Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.927306 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lspxp"] Mar 12 00:11:46 crc kubenswrapper[4870]: W0312 00:11:46.945362 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67d7087_6ab3_42ff_b5cc_cba186b5b036.slice/crio-19d20adf0d6de1a3daf6ae536c9bbab1fe6a9bdba73e728005292f8ef03f53eb WatchSource:0}: Error finding container 19d20adf0d6de1a3daf6ae536c9bbab1fe6a9bdba73e728005292f8ef03f53eb: Status 404 returned error can't find the container with id 19d20adf0d6de1a3daf6ae536c9bbab1fe6a9bdba73e728005292f8ef03f53eb Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.947836 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5"] Mar 12 00:11:46 crc kubenswrapper[4870]: W0312 00:11:46.955442 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode80a24bf_734e_476e_9559_4b1bc913802a.slice/crio-b9f9817f85dc6c38fdc403cc1fdb08caa466d303089d02a9ce5e49cc4713b291 WatchSource:0}: Error finding container b9f9817f85dc6c38fdc403cc1fdb08caa466d303089d02a9ce5e49cc4713b291: Status 404 returned error can't find the container with id b9f9817f85dc6c38fdc403cc1fdb08caa466d303089d02a9ce5e49cc4713b291 Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.956171 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.964077 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" Mar 12 00:11:46 crc kubenswrapper[4870]: W0312 00:11:46.969073 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ff417d8_c2c5_40bf_bc0a_2718a9f88e2a.slice/crio-801429c3c86b49e419b8ec526eeb8e8a670e2fb7271776a748dc2cd1cdc293da WatchSource:0}: Error finding container 801429c3c86b49e419b8ec526eeb8e8a670e2fb7271776a748dc2cd1cdc293da: Status 404 returned error can't find the container with id 801429c3c86b49e419b8ec526eeb8e8a670e2fb7271776a748dc2cd1cdc293da Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.979095 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-h8r8v"] Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.979213 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.994068 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.995354 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-tls\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:46 crc kubenswrapper[4870]: I0312 00:11:46.996451 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-certificates\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:46.996569 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.496542943 +0000 UTC m=+198.099959253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.996649 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb913154-2066-4879-9598-0a72095a8a5d-serving-cert\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998042 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29554560-f9x27"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998487 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-default-certificate\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998587 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-service-ca-bundle\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc 
kubenswrapper[4870]: I0312 00:11:46.998613 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnlr\" (UniqueName: \"kubernetes.io/projected/ba1c30d1-c2ba-42ce-82d5-7602956ff030-kube-api-access-xmnlr\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998663 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2507766-d14c-437b-a485-91563bb9b272-trusted-ca\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998703 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998730 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f902cf-f8f8-4895-8238-691ef6d7686d-metrics-tls\") pod \"dns-operator-744455d44c-8jsrj\" (UID: \"33f902cf-f8f8-4895-8238-691ef6d7686d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998753 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjmht\" (UniqueName: \"kubernetes.io/projected/eb913154-2066-4879-9598-0a72095a8a5d-kube-api-access-wjmht\") 
pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998781 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998818 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/581f86c7-aa5f-4071-bba8-fd537cb93402-srv-cert\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998844 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvqs2\" (UniqueName: \"kubernetes.io/projected/33f902cf-f8f8-4895-8238-691ef6d7686d-kube-api-access-bvqs2\") pod \"dns-operator-744455d44c-8jsrj\" (UID: \"33f902cf-f8f8-4895-8238-691ef6d7686d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998881 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-trusted-ca\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998902 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-prqs5\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-kube-api-access-prqs5\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998924 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-config\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998926 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-certificates\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998948 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.998973 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-metrics-certs\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " 
pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.999387 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-bound-sa-token\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:46.999418 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-stats-auth\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:46.999656 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.499644705 +0000 UTC m=+198.103061095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.000225 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2507766-d14c-437b-a485-91563bb9b272-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.000452 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba1c30d1-c2ba-42ce-82d5-7602956ff030-service-ca-bundle\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.000522 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svfw9\" (UniqueName: \"kubernetes.io/projected/581f86c7-aa5f-4071-bba8-fd537cb93402-kube-api-access-svfw9\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.000668 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.001242 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.001850 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-tls\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.001958 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2507766-d14c-437b-a485-91563bb9b272-metrics-tls\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.002157 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/581f86c7-aa5f-4071-bba8-fd537cb93402-profile-collector-cert\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.002178 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqm6s\" (UniqueName: \"kubernetes.io/projected/d2507766-d14c-437b-a485-91563bb9b272-kube-api-access-dqm6s\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.003041 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-trusted-ca\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.005610 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.007743 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/581f86c7-aa5f-4071-bba8-fd537cb93402-profile-collector-cert\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.010372 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/581f86c7-aa5f-4071-bba8-fd537cb93402-srv-cert\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.031628 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.037348 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.040889 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqs5\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-kube-api-access-prqs5\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.058069 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bbcc4"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.058935 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-bound-sa-token\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.085558 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svfw9\" (UniqueName: \"kubernetes.io/projected/581f86c7-aa5f-4071-bba8-fd537cb93402-kube-api-access-svfw9\") pod \"catalog-operator-68c6474976-4kkqt\" (UID: \"581f86c7-aa5f-4071-bba8-fd537cb93402\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.090213 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" event={"ID":"e80a24bf-734e-476e-9559-4b1bc913802a","Type":"ContainerStarted","Data":"b9f9817f85dc6c38fdc403cc1fdb08caa466d303089d02a9ce5e49cc4713b291"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.103776 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.103895 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.603870249 +0000 UTC m=+198.207286559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104245 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-stats-auth\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104297 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqsj7\" (UniqueName: \"kubernetes.io/projected/0d1beb33-53de-4a02-acec-735ca52df759-kube-api-access-kqsj7\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104317 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2507766-d14c-437b-a485-91563bb9b272-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104337 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b9a3066-da46-44d9-9cae-9b073151540c-config\") pod 
\"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104395 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-socket-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104410 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aded5c32-6731-43cc-8701-4d847d663dd2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bvpw\" (UID: \"aded5c32-6731-43cc-8701-4d847d663dd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104439 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e637c-fad7-47e8-a70b-8871cf03c832-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mw8sm\" (UID: \"c83e637c-fad7-47e8-a70b-8871cf03c832\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104467 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba1c30d1-c2ba-42ce-82d5-7602956ff030-service-ca-bundle\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 
12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104551 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-plugins-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104600 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-config-volume\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104615 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b9a3066-da46-44d9-9cae-9b073151540c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104631 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtc9z\" (UniqueName: \"kubernetes.io/projected/dd365575-f7d8-45f6-b5fc-ef6069e47374-kube-api-access-mtc9z\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104653 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xvd\" (UniqueName: \"kubernetes.io/projected/eb28d2f3-0064-4c7b-a036-1a8080bff91b-kube-api-access-r9xvd\") pod 
\"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104669 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm44j\" (UniqueName: \"kubernetes.io/projected/68521596-9390-4238-b387-99895749ff85-kube-api-access-jm44j\") pod \"ingress-canary-k5ks6\" (UID: \"68521596-9390-4238-b387-99895749ff85\") " pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104704 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2507766-d14c-437b-a485-91563bb9b272-metrics-tls\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104721 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9a3066-da46-44d9-9cae-9b073151540c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104739 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ml8m\" (UniqueName: \"kubernetes.io/projected/cf754ba1-52f1-478d-9b07-1d83e55d3020-kube-api-access-4ml8m\") pod \"auto-csr-approver-29554570-l4btp\" (UID: \"cf754ba1-52f1-478d-9b07-1d83e55d3020\") " pod="openshift-infra/auto-csr-approver-29554570-l4btp" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104756 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dqm6s\" (UniqueName: \"kubernetes.io/projected/d2507766-d14c-437b-a485-91563bb9b272-kube-api-access-dqm6s\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104787 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/eb28d2f3-0064-4c7b-a036-1a8080bff91b-tmpfs\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104802 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmpm\" (UniqueName: \"kubernetes.io/projected/596347fa-d520-46af-b25c-860d7c0d91a4-kube-api-access-7zmpm\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104816 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-secret-volume\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104830 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbdn9\" (UniqueName: \"kubernetes.io/projected/ca72e460-0d01-4a2d-9796-c3a65dd38aec-kube-api-access-xbdn9\") pod \"csi-hostpathplugin-z5bcc\" (UID: 
\"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104856 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d0aa413-952c-42dd-9267-a3f29e9bebe1-srv-cert\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104873 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb28d2f3-0064-4c7b-a036-1a8080bff91b-webhook-cert\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104897 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.104993 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68521596-9390-4238-b387-99895749ff85-cert\") pod \"ingress-canary-k5ks6\" (UID: \"68521596-9390-4238-b387-99895749ff85\") " pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.105109 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-868ws\" (UniqueName: 
\"kubernetes.io/projected/aded5c32-6731-43cc-8701-4d847d663dd2-kube-api-access-868ws\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bvpw\" (UID: \"aded5c32-6731-43cc-8701-4d847d663dd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.109588 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb913154-2066-4879-9598-0a72095a8a5d-serving-cert\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.109714 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d00ae6-5fa7-4282-bb14-37c7f3202784-serving-cert\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.109748 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd365575-f7d8-45f6-b5fc-ef6069e47374-metrics-tls\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.109843 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-default-certificate\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.109872 
4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fafe5ce-3f4c-4e99-b673-edd2836d0392-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-clrh6\" (UID: \"1fafe5ce-3f4c-4e99-b673-edd2836d0392\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.109908 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d0aa413-952c-42dd-9267-a3f29e9bebe1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.109942 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnhmf\" (UniqueName: \"kubernetes.io/projected/b0d00ae6-5fa7-4282-bb14-37c7f3202784-kube-api-access-pnhmf\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110037 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-registration-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110182 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfpx\" (UniqueName: \"kubernetes.io/projected/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-kube-api-access-9jfpx\") pod 
\"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110211 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-mountpoint-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110291 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-service-ca-bundle\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110319 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-csi-data-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110354 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnlr\" (UniqueName: \"kubernetes.io/projected/ba1c30d1-c2ba-42ce-82d5-7602956ff030-kube-api-access-xmnlr\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110388 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-9c9r7\" (UniqueName: \"kubernetes.io/projected/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-kube-api-access-9c9r7\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110443 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2507766-d14c-437b-a485-91563bb9b272-trusted-ca\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110530 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110585 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f902cf-f8f8-4895-8238-691ef6d7686d-metrics-tls\") pod \"dns-operator-744455d44c-8jsrj\" (UID: \"33f902cf-f8f8-4895-8238-691ef6d7686d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110645 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjmht\" (UniqueName: \"kubernetes.io/projected/eb913154-2066-4879-9598-0a72095a8a5d-kube-api-access-wjmht\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 
12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110678 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwqd\" (UniqueName: \"kubernetes.io/projected/c83e637c-fad7-47e8-a70b-8871cf03c832-kube-api-access-fwwqd\") pod \"package-server-manager-789f6589d5-mw8sm\" (UID: \"c83e637c-fad7-47e8-a70b-8871cf03c832\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110764 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-node-bootstrap-token\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110797 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb28d2f3-0064-4c7b-a036-1a8080bff91b-apiservice-cert\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110825 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d00ae6-5fa7-4282-bb14-37c7f3202784-config\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110906 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvqs2\" (UniqueName: 
\"kubernetes.io/projected/33f902cf-f8f8-4895-8238-691ef6d7686d-kube-api-access-bvqs2\") pod \"dns-operator-744455d44c-8jsrj\" (UID: \"33f902cf-f8f8-4895-8238-691ef6d7686d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110938 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lrj\" (UniqueName: \"kubernetes.io/projected/1fafe5ce-3f4c-4e99-b673-edd2836d0392-kube-api-access-r9lrj\") pod \"multus-admission-controller-857f4d67dd-clrh6\" (UID: \"1fafe5ce-3f4c-4e99-b673-edd2836d0392\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.110972 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd365575-f7d8-45f6-b5fc-ef6069e47374-config-volume\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.111043 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-config\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.111072 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 
00:11:47.111108 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-metrics-certs\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.111133 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmwf\" (UniqueName: \"kubernetes.io/projected/6d0aa413-952c-42dd-9267-a3f29e9bebe1-kube-api-access-9gmwf\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.111194 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d1beb33-53de-4a02-acec-735ca52df759-proxy-tls\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.111231 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0d1beb33-53de-4a02-acec-735ca52df759-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.111283 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.111312 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-certs\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.117035 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb913154-2066-4879-9598-0a72095a8a5d-serving-cert\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.121932 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-default-certificate\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.122307 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-stats-auth\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.126184 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.62613639 +0000 UTC m=+198.229552700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.126317 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-config\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.126338 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-service-ca-bundle\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.128815 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2507766-d14c-437b-a485-91563bb9b272-trusted-ca\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 
00:11:47.129139 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba1c30d1-c2ba-42ce-82d5-7602956ff030-metrics-certs\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.138138 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f902cf-f8f8-4895-8238-691ef6d7686d-metrics-tls\") pod \"dns-operator-744455d44c-8jsrj\" (UID: \"33f902cf-f8f8-4895-8238-691ef6d7686d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.138343 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba1c30d1-c2ba-42ce-82d5-7602956ff030-service-ca-bundle\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.148773 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" event={"ID":"d8041594-4bbd-408a-b59d-26bb0e17a95e","Type":"ContainerStarted","Data":"8c8957f063e26179f47fd70fd1dd02dae3342fd106edd11a67f094216c1ba36d"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.148849 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2507766-d14c-437b-a485-91563bb9b272-metrics-tls\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.150344 4870 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb913154-2066-4879-9598-0a72095a8a5d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.150676 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vbgrg"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.155158 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2507766-d14c-437b-a485-91563bb9b272-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.192326 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" event={"ID":"a06725c1-5841-4d3b-ae47-c78a608229e0","Type":"ContainerStarted","Data":"9c11b57ecb00600f002566ab2a9cfa2b86e73391a08edf93a4a6e5a86824bfc6"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.192369 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" event={"ID":"a06725c1-5841-4d3b-ae47-c78a608229e0","Type":"ContainerStarted","Data":"eaaec6e7ebd0690eb4df1931399c142b59cb4abe167df91d3c0c0de6a8c48563"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.198426 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" event={"ID":"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a","Type":"ContainerStarted","Data":"801429c3c86b49e419b8ec526eeb8e8a670e2fb7271776a748dc2cd1cdc293da"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.204375 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" event={"ID":"c3b68206-2dd1-410e-930d-a97b21caddc9","Type":"ContainerStarted","Data":"98d3823efa951ecc1afedd07bfe1e8b717e75443d0bd9b72f78e34470b4e3702"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.205823 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvqs2\" (UniqueName: \"kubernetes.io/projected/33f902cf-f8f8-4895-8238-691ef6d7686d-kube-api-access-bvqs2\") pod \"dns-operator-744455d44c-8jsrj\" (UID: \"33f902cf-f8f8-4895-8238-691ef6d7686d\") " pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212181 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212510 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d1beb33-53de-4a02-acec-735ca52df759-proxy-tls\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212540 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0d1beb33-53de-4a02-acec-735ca52df759-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc 
kubenswrapper[4870]: I0312 00:11:47.212570 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212591 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-certs\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212612 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqsj7\" (UniqueName: \"kubernetes.io/projected/0d1beb33-53de-4a02-acec-735ca52df759-kube-api-access-kqsj7\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212630 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b9a3066-da46-44d9-9cae-9b073151540c-config\") pod \"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212632 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212668 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-socket-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212689 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aded5c32-6731-43cc-8701-4d847d663dd2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bvpw\" (UID: \"aded5c32-6731-43cc-8701-4d847d663dd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212713 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e637c-fad7-47e8-a70b-8871cf03c832-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mw8sm\" (UID: \"c83e637c-fad7-47e8-a70b-8871cf03c832\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212742 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-plugins-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212771 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-config-volume\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212798 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b9a3066-da46-44d9-9cae-9b073151540c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212818 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtc9z\" (UniqueName: \"kubernetes.io/projected/dd365575-f7d8-45f6-b5fc-ef6069e47374-kube-api-access-mtc9z\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212849 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xvd\" (UniqueName: \"kubernetes.io/projected/eb28d2f3-0064-4c7b-a036-1a8080bff91b-kube-api-access-r9xvd\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212871 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm44j\" (UniqueName: \"kubernetes.io/projected/68521596-9390-4238-b387-99895749ff85-kube-api-access-jm44j\") pod \"ingress-canary-k5ks6\" (UID: \"68521596-9390-4238-b387-99895749ff85\") " pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212893 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9a3066-da46-44d9-9cae-9b073151540c-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212913 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ml8m\" (UniqueName: \"kubernetes.io/projected/cf754ba1-52f1-478d-9b07-1d83e55d3020-kube-api-access-4ml8m\") pod \"auto-csr-approver-29554570-l4btp\" (UID: \"cf754ba1-52f1-478d-9b07-1d83e55d3020\") " pod="openshift-infra/auto-csr-approver-29554570-l4btp" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212942 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/eb28d2f3-0064-4c7b-a036-1a8080bff91b-tmpfs\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212961 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmpm\" (UniqueName: \"kubernetes.io/projected/596347fa-d520-46af-b25c-860d7c0d91a4-kube-api-access-7zmpm\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.212981 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-secret-volume\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213001 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xbdn9\" (UniqueName: \"kubernetes.io/projected/ca72e460-0d01-4a2d-9796-c3a65dd38aec-kube-api-access-xbdn9\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213022 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d0aa413-952c-42dd-9267-a3f29e9bebe1-srv-cert\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213044 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb28d2f3-0064-4c7b-a036-1a8080bff91b-webhook-cert\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213066 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213087 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68521596-9390-4238-b387-99895749ff85-cert\") pod \"ingress-canary-k5ks6\" (UID: \"68521596-9390-4238-b387-99895749ff85\") " pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213111 4870 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-868ws\" (UniqueName: \"kubernetes.io/projected/aded5c32-6731-43cc-8701-4d847d663dd2-kube-api-access-868ws\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bvpw\" (UID: \"aded5c32-6731-43cc-8701-4d847d663dd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213134 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d00ae6-5fa7-4282-bb14-37c7f3202784-serving-cert\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213171 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd365575-f7d8-45f6-b5fc-ef6069e47374-metrics-tls\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213197 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fafe5ce-3f4c-4e99-b673-edd2836d0392-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-clrh6\" (UID: \"1fafe5ce-3f4c-4e99-b673-edd2836d0392\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213221 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d0aa413-952c-42dd-9267-a3f29e9bebe1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc 
kubenswrapper[4870]: I0312 00:11:47.213244 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnhmf\" (UniqueName: \"kubernetes.io/projected/b0d00ae6-5fa7-4282-bb14-37c7f3202784-kube-api-access-pnhmf\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213273 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-registration-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213296 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfpx\" (UniqueName: \"kubernetes.io/projected/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-kube-api-access-9jfpx\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213320 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-mountpoint-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213339 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-csi-data-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " 
pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213368 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c9r7\" (UniqueName: \"kubernetes.io/projected/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-kube-api-access-9c9r7\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213402 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwqd\" (UniqueName: \"kubernetes.io/projected/c83e637c-fad7-47e8-a70b-8871cf03c832-kube-api-access-fwwqd\") pod \"package-server-manager-789f6589d5-mw8sm\" (UID: \"c83e637c-fad7-47e8-a70b-8871cf03c832\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213439 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-node-bootstrap-token\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213458 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb28d2f3-0064-4c7b-a036-1a8080bff91b-apiservice-cert\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213478 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b0d00ae6-5fa7-4282-bb14-37c7f3202784-config\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213498 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lrj\" (UniqueName: \"kubernetes.io/projected/1fafe5ce-3f4c-4e99-b673-edd2836d0392-kube-api-access-r9lrj\") pod \"multus-admission-controller-857f4d67dd-clrh6\" (UID: \"1fafe5ce-3f4c-4e99-b673-edd2836d0392\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213519 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd365575-f7d8-45f6-b5fc-ef6069e47374-config-volume\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.213541 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmwf\" (UniqueName: \"kubernetes.io/projected/6d0aa413-952c-42dd-9267-a3f29e9bebe1-kube-api-access-9gmwf\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.218622 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.219326 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0d1beb33-53de-4a02-acec-735ca52df759-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.220744 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d1beb33-53de-4a02-acec-735ca52df759-proxy-tls\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.221055 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-socket-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.221257 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.721122571 +0000 UTC m=+198.324538881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.221920 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b9a3066-da46-44d9-9cae-9b073151540c-config\") pod \"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.222496 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnlr\" (UniqueName: \"kubernetes.io/projected/ba1c30d1-c2ba-42ce-82d5-7602956ff030-kube-api-access-xmnlr\") pod \"router-default-5444994796-647f6\" (UID: \"ba1c30d1-c2ba-42ce-82d5-7602956ff030\") " pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.222538 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.222698 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-registration-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: 
\"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.224020 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb28d2f3-0064-4c7b-a036-1a8080bff91b-webhook-cert\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.224498 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/eb28d2f3-0064-4c7b-a036-1a8080bff91b-tmpfs\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.226490 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-csi-data-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.226581 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-mountpoint-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.226887 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d00ae6-5fa7-4282-bb14-37c7f3202784-config\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.226963 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ca72e460-0d01-4a2d-9796-c3a65dd38aec-plugins-dir\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.229559 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-secret-volume\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.229632 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-node-bootstrap-token\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.229849 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" event={"ID":"14ae78cd-522f-4125-b2d9-84c52dbeadcb","Type":"ContainerStarted","Data":"8f102792ec4f68ada71b5aa10b40c210673e2b401233c38b1bd93b8031060be7"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.230252 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d0aa413-952c-42dd-9267-a3f29e9bebe1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.231492 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd365575-f7d8-45f6-b5fc-ef6069e47374-config-volume\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.232365 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-config-volume\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.232902 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-certs\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.234204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb28d2f3-0064-4c7b-a036-1a8080bff91b-apiservice-cert\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.237584 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fafe5ce-3f4c-4e99-b673-edd2836d0392-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-clrh6\" (UID: \"1fafe5ce-3f4c-4e99-b673-edd2836d0392\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.237719 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d0aa413-952c-42dd-9267-a3f29e9bebe1-srv-cert\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.237755 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68521596-9390-4238-b387-99895749ff85-cert\") pod \"ingress-canary-k5ks6\" (UID: \"68521596-9390-4238-b387-99895749ff85\") " pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.237851 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aded5c32-6731-43cc-8701-4d847d663dd2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bvpw\" (UID: \"aded5c32-6731-43cc-8701-4d847d663dd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.238078 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b9a3066-da46-44d9-9cae-9b073151540c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.238119 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd365575-f7d8-45f6-b5fc-ef6069e47374-metrics-tls\") pod \"dns-default-nkq8z\" (UID: 
\"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.241158 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqm6s\" (UniqueName: \"kubernetes.io/projected/d2507766-d14c-437b-a485-91563bb9b272-kube-api-access-dqm6s\") pod \"ingress-operator-5b745b69d9-zsdq9\" (UID: \"d2507766-d14c-437b-a485-91563bb9b272\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.247039 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" event={"ID":"368b0e75-e87a-43f9-9369-588871bf28be","Type":"ContainerStarted","Data":"8babf072d8163389193d6157059eab62d4a37451c3217f76c6e5c63bb846ebc9"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.247099 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d00ae6-5fa7-4282-bb14-37c7f3202784-serving-cert\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.247682 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e637c-fad7-47e8-a70b-8871cf03c832-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mw8sm\" (UID: \"c83e637c-fad7-47e8-a70b-8871cf03c832\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.251906 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.255117 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" event={"ID":"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12","Type":"ContainerStarted","Data":"2c9028f3ba80971a9c020a37bd6682e94b713f84cf7cbcfa8b05ec11b3ae8925"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.257107 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" event={"ID":"a67d7087-6ab3-42ff-b5cc-cba186b5b036","Type":"ContainerStarted","Data":"19d20adf0d6de1a3daf6ae536c9bbab1fe6a9bdba73e728005292f8ef03f53eb"} Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.261436 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjmht\" (UniqueName: \"kubernetes.io/projected/eb913154-2066-4879-9598-0a72095a8a5d-kube-api-access-wjmht\") pod \"authentication-operator-69f744f599-dcrsg\" (UID: \"eb913154-2066-4879-9598-0a72095a8a5d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.282051 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.288083 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.303301 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmwf\" (UniqueName: \"kubernetes.io/projected/6d0aa413-952c-42dd-9267-a3f29e9bebe1-kube-api-access-9gmwf\") pod \"olm-operator-6b444d44fb-tg468\" (UID: \"6d0aa413-952c-42dd-9267-a3f29e9bebe1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.314784 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.315775 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.815762141 +0000 UTC m=+198.419178451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.318924 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc9z\" (UniqueName: \"kubernetes.io/projected/dd365575-f7d8-45f6-b5fc-ef6069e47374-kube-api-access-mtc9z\") pod \"dns-default-nkq8z\" (UID: \"dd365575-f7d8-45f6-b5fc-ef6069e47374\") " pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.349617 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.355287 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqsj7\" (UniqueName: \"kubernetes.io/projected/0d1beb33-53de-4a02-acec-735ca52df759-kube-api-access-kqsj7\") pod \"machine-config-controller-84d6567774-l5dpd\" (UID: \"0d1beb33-53de-4a02-acec-735ca52df759\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.364279 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.367071 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jfpx\" (UniqueName: \"kubernetes.io/projected/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-kube-api-access-9jfpx\") pod \"collect-profiles-29554560-rqkss\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.377119 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnhmf\" (UniqueName: \"kubernetes.io/projected/b0d00ae6-5fa7-4282-bb14-37c7f3202784-kube-api-access-pnhmf\") pod \"service-ca-operator-777779d784-gptpv\" (UID: \"b0d00ae6-5fa7-4282-bb14-37c7f3202784\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.392213 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.400739 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-868ws\" (UniqueName: \"kubernetes.io/projected/aded5c32-6731-43cc-8701-4d847d663dd2-kube-api-access-868ws\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bvpw\" (UID: \"aded5c32-6731-43cc-8701-4d847d663dd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.412927 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.416637 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.417117 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:47.91710096 +0000 UTC m=+198.520517270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.420396 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xvd\" (UniqueName: \"kubernetes.io/projected/eb28d2f3-0064-4c7b-a036-1a8080bff91b-kube-api-access-r9xvd\") pod \"packageserver-d55dfcdfc-wdxcq\" (UID: \"eb28d2f3-0064-4c7b-a036-1a8080bff91b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.438450 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm44j\" (UniqueName: 
\"kubernetes.io/projected/68521596-9390-4238-b387-99895749ff85-kube-api-access-jm44j\") pod \"ingress-canary-k5ks6\" (UID: \"68521596-9390-4238-b387-99895749ff85\") " pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.449614 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.464263 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9a3066-da46-44d9-9cae-9b073151540c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cqcmc\" (UID: \"0b9a3066-da46-44d9-9cae-9b073151540c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.473781 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ml8m\" (UniqueName: \"kubernetes.io/projected/cf754ba1-52f1-478d-9b07-1d83e55d3020-kube-api-access-4ml8m\") pod \"auto-csr-approver-29554570-l4btp\" (UID: \"cf754ba1-52f1-478d-9b07-1d83e55d3020\") " pod="openshift-infra/auto-csr-approver-29554570-l4btp" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.482742 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.495932 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmpm\" (UniqueName: \"kubernetes.io/projected/596347fa-d520-46af-b25c-860d7c0d91a4-kube-api-access-7zmpm\") pod \"marketplace-operator-79b997595-6znk2\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.515001 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.517848 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.518128 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.018115609 +0000 UTC m=+198.621531919 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.520295 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9lrj\" (UniqueName: \"kubernetes.io/projected/1fafe5ce-3f4c-4e99-b673-edd2836d0392-kube-api-access-r9lrj\") pod \"multus-admission-controller-857f4d67dd-clrh6\" (UID: \"1fafe5ce-3f4c-4e99-b673-edd2836d0392\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.527501 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw"] Mar 
12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.535164 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5l64f"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.538810 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwqd\" (UniqueName: \"kubernetes.io/projected/c83e637c-fad7-47e8-a70b-8871cf03c832-kube-api-access-fwwqd\") pod \"package-server-manager-789f6589d5-mw8sm\" (UID: \"c83e637c-fad7-47e8-a70b-8871cf03c832\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.540805 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.556578 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbdn9\" (UniqueName: \"kubernetes.io/projected/ca72e460-0d01-4a2d-9796-c3a65dd38aec-kube-api-access-xbdn9\") pod \"csi-hostpathplugin-z5bcc\" (UID: \"ca72e460-0d01-4a2d-9796-c3a65dd38aec\") " pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.572563 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5l4xj"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.588440 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c9r7\" (UniqueName: \"kubernetes.io/projected/3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e-kube-api-access-9c9r7\") pod \"machine-config-server-7vhpt\" (UID: \"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e\") " pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.594382 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.594445 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.603690 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.619762 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.620503 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.620949 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.120930602 +0000 UTC m=+198.724346912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.641377 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" Mar 12 00:11:47 crc kubenswrapper[4870]: W0312 00:11:47.647096 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dc311ee_fc53_4e2b_8b1d_d512f36208cb.slice/crio-9c4046fa935cc5de680ed8558ae5baaf2c8187620c5e8bb4e5110cc56abf821f WatchSource:0}: Error finding container 9c4046fa935cc5de680ed8558ae5baaf2c8187620c5e8bb4e5110cc56abf821f: Status 404 returned error can't find the container with id 9c4046fa935cc5de680ed8558ae5baaf2c8187620c5e8bb4e5110cc56abf821f Mar 12 00:11:47 crc kubenswrapper[4870]: W0312 00:11:47.648038 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0465219_4339_46be_90ab_0e4519f19493.slice/crio-b5b1c52d2b82854302e429757e34a74bd26a6fe82163cb1a336a93afa2a0b90d WatchSource:0}: Error finding container b5b1c52d2b82854302e429757e34a74bd26a6fe82163cb1a336a93afa2a0b90d: Status 404 returned error can't find the container with id b5b1c52d2b82854302e429757e34a74bd26a6fe82163cb1a336a93afa2a0b90d Mar 12 00:11:47 crc kubenswrapper[4870]: W0312 00:11:47.649942 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2554ffbb_61ab_48bc_bc78_05ae8517f40d.slice/crio-b9238c742adad56d5f7f013b9f76680e0b4290aa0c3b7f011db99941a6133560 WatchSource:0}: Error finding container b9238c742adad56d5f7f013b9f76680e0b4290aa0c3b7f011db99941a6133560: Status 404 returned error can't find the container with id b9238c742adad56d5f7f013b9f76680e0b4290aa0c3b7f011db99941a6133560 Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.651379 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.654464 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.659209 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.679908 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554570-l4btp" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.686455 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.700891 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.704354 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.720454 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8jsrj"] Mar 12 00:11:47 crc kubenswrapper[4870]: W0312 00:11:47.721329 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0cb447f_792e_4438_b3f2_32bd2f408f03.slice/crio-3e30e0e24bdd588de284cd0a98490ed92e7bb93e217b2ae76b0a78b7328dbcda WatchSource:0}: Error finding container 3e30e0e24bdd588de284cd0a98490ed92e7bb93e217b2ae76b0a78b7328dbcda: Status 404 returned error can't find the container with id 3e30e0e24bdd588de284cd0a98490ed92e7bb93e217b2ae76b0a78b7328dbcda Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.721418 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.722231 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.723234 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.22321737 +0000 UTC m=+198.826633680 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: W0312 00:11:47.726832 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd7abe7_1271_40a0_b011_eb4841fb3c03.slice/crio-4a0836876ab2ee49cc83f5485231cd2e770eec46e17d7ce869a3763a829a8a1c WatchSource:0}: Error finding container 4a0836876ab2ee49cc83f5485231cd2e770eec46e17d7ce869a3763a829a8a1c: Status 404 returned error can't find the container with id 4a0836876ab2ee49cc83f5485231cd2e770eec46e17d7ce869a3763a829a8a1c Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.726925 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k5ks6" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.732823 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.755232 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7vhpt" Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.773003 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.829021 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.829219 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.329189466 +0000 UTC m=+198.932605776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.829419 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.829884 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.329872077 +0000 UTC m=+198.933288387 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.919599 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468"] Mar 12 00:11:47 crc kubenswrapper[4870]: I0312 00:11:47.931266 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:47 crc kubenswrapper[4870]: E0312 00:11:47.931888 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.431864975 +0000 UTC m=+199.035281285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.012729 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd"] Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.032924 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.033296 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.533280886 +0000 UTC m=+199.136697196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:48 crc kubenswrapper[4870]: W0312 00:11:48.127597 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d0aa413_952c_42dd_9267_a3f29e9bebe1.slice/crio-f0d67730906e1e4318f7b2b4ba68953fe09622a0c7d0c96bd53fe0a3b6f3c66e WatchSource:0}: Error finding container f0d67730906e1e4318f7b2b4ba68953fe09622a0c7d0c96bd53fe0a3b6f3c66e: Status 404 returned error can't find the container with id f0d67730906e1e4318f7b2b4ba68953fe09622a0c7d0c96bd53fe0a3b6f3c66e Mar 12 00:11:48 crc kubenswrapper[4870]: W0312 00:11:48.131857 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1beb33_53de_4a02_acec_735ca52df759.slice/crio-9d229b328cb5c36d2f4b27fe47c89e00857d9f77968626e62caf819ea77dfd61 WatchSource:0}: Error finding container 9d229b328cb5c36d2f4b27fe47c89e00857d9f77968626e62caf819ea77dfd61: Status 404 returned error can't find the container with id 9d229b328cb5c36d2f4b27fe47c89e00857d9f77968626e62caf819ea77dfd61 Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.133818 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.133956 4870 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.633937485 +0000 UTC m=+199.237353795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.136146 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.137188 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.637175821 +0000 UTC m=+199.240592131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.240090 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.240431 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.740416877 +0000 UTC m=+199.343833177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.296082 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss"] Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.303983 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" event={"ID":"bfd7abe7-1271-40a0-b011-eb4841fb3c03","Type":"ContainerStarted","Data":"4a0836876ab2ee49cc83f5485231cd2e770eec46e17d7ce869a3763a829a8a1c"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.315077 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm"] Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.316608 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" event={"ID":"ee267da0-c1eb-4a5d-80d5-da65c77a7c23","Type":"ContainerStarted","Data":"a1b3292e81f1e6f161633c2cc0366a9fef20c7780891af6ad8c11c429294ab82"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.316669 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" event={"ID":"ee267da0-c1eb-4a5d-80d5-da65c77a7c23","Type":"ContainerStarted","Data":"48be66dd366b0bfd340682e948ab2adcf43c6cf29ef16ec0b0b00cbebb69930f"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.322375 4870 generic.go:334] "Generic (PLEG): container 
finished" podID="a67d7087-6ab3-42ff-b5cc-cba186b5b036" containerID="d33b816af64727e7a39fe0e9cb8908593662e171aea14d83bc8b6fa0473158fb" exitCode=0 Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.322729 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" event={"ID":"a67d7087-6ab3-42ff-b5cc-cba186b5b036","Type":"ContainerDied","Data":"d33b816af64727e7a39fe0e9cb8908593662e171aea14d83bc8b6fa0473158fb"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.324077 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" event={"ID":"c0465219-4339-46be-90ab-0e4519f19493","Type":"ContainerStarted","Data":"b5b1c52d2b82854302e429757e34a74bd26a6fe82163cb1a336a93afa2a0b90d"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.325585 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" event={"ID":"c3b68206-2dd1-410e-930d-a97b21caddc9","Type":"ContainerStarted","Data":"6c029b3048c5208524a0f5c9d8d4cd7514e3ec4d0d5134461415c26c6322ea4d"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.327321 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vbgrg" event={"ID":"8d26541a-27be-4bb8-99f2-43f63e4729a2","Type":"ContainerStarted","Data":"5f2acdbbcbe470e0d3f00140833b24e734f3a1e9dba180438a10756db861b12e"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.327374 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vbgrg" event={"ID":"8d26541a-27be-4bb8-99f2-43f63e4729a2","Type":"ContainerStarted","Data":"896839b3035df413b1eee0b33ce0d02efca92981916d124b4f5156ac140f23d8"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.328999 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" 
event={"ID":"33f902cf-f8f8-4895-8238-691ef6d7686d","Type":"ContainerStarted","Data":"dd512bc74653d89546880c547bfccc8b9c8dfa5395f1256a487a86b028e45912"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.330484 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" event={"ID":"6a38e148-742e-4f33-a30c-7289fad54acb","Type":"ContainerStarted","Data":"31f168be63d1da895422399543cb0fc72fc5fff90dca727e47a3f19eff14970d"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.330515 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" event={"ID":"6a38e148-742e-4f33-a30c-7289fad54acb","Type":"ContainerStarted","Data":"af795f3927eb3db69cc4c2971a9ce6b3161b963868f596b0f35df66ddb21ce08"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.331811 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-h8r8v" event={"ID":"dae6a345-cb5d-4553-868f-232fc4ec81af","Type":"ContainerStarted","Data":"73b7f1906124630229501f7e7825c20e5a2cf7a8574a8673d06c389f78a8a4a5"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.331835 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-h8r8v" event={"ID":"dae6a345-cb5d-4553-868f-232fc4ec81af","Type":"ContainerStarted","Data":"e62cdd14085a1cb95742f47c1cbc65c0776be498b282eddf211cc79b5f26c6ef"} Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.331955 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-h8r8v" Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.332899 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" 
event={"ID":"6d0aa413-952c-42dd-9267-a3f29e9bebe1","Type":"ContainerStarted","Data":"f0d67730906e1e4318f7b2b4ba68953fe09622a0c7d0c96bd53fe0a3b6f3c66e"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.333900 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" event={"ID":"93b8bee4-6475-4c86-abeb-92e555f4e1eb","Type":"ContainerStarted","Data":"db1d6caaafbdb13c3272b141ae126a078d7b4e6e52f51ca6e88279d94638b0b6"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.338945 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" event={"ID":"fd66140c-ae6e-461a-84a9-597fdb115dd8","Type":"ContainerStarted","Data":"12246900a7140da56226ce91b19d4733fe859511644dc056e7997c8b2802c4c6"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.339702 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" event={"ID":"8dc311ee-fc53-4e2b-8b1d-d512f36208cb","Type":"ContainerStarted","Data":"9c4046fa935cc5de680ed8558ae5baaf2c8187620c5e8bb4e5110cc56abf821f"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.340294 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" event={"ID":"2554ffbb-61ab-48bc-bc78-05ae8517f40d","Type":"ContainerStarted","Data":"b9238c742adad56d5f7f013b9f76680e0b4290aa0c3b7f011db99941a6133560"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.341022 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.343272 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.843261181 +0000 UTC m=+199.446677491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.348471 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-h8r8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.348542 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h8r8v" podUID="dae6a345-cb5d-4553-868f-232fc4ec81af" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.356599 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" event={"ID":"d8041594-4bbd-408a-b59d-26bb0e17a95e","Type":"ContainerStarted","Data":"8fb02b59d562217cfb4bbd1b6e01206d7f4937ae9016911c86f668bed40e4e44"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.357259 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.368059 4870 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-78r6p container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.368319 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" podUID="d8041594-4bbd-408a-b59d-26bb0e17a95e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.375549 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.378618 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nkq8z"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.379566 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" event={"ID":"a0cb447f-792e-4438-b3f2-32bd2f408f03","Type":"ContainerStarted","Data":"3e30e0e24bdd588de284cd0a98490ed92e7bb93e217b2ae76b0a78b7328dbcda"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.395632 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" event={"ID":"368b0e75-e87a-43f9-9369-588871bf28be","Type":"ContainerStarted","Data":"a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.395965 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.403963 4870 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7gfg9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.404028 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" podUID="368b0e75-e87a-43f9-9369-588871bf28be" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.410504 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" event={"ID":"a06725c1-5841-4d3b-ae47-c78a608229e0","Type":"ContainerStarted","Data":"3203a9fb8fd43e52540930199c001c8e6c31a827c4222fe94fd28b48fb87310a"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.413438 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" event={"ID":"0498d476-ade9-410a-93e3-5f344bccd8ba","Type":"ContainerStarted","Data":"64db7c9edc9841aab3043b978d95534a975c6c452a6d738d2ccfcb5d1410a9a7"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.413474 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" event={"ID":"0498d476-ade9-410a-93e3-5f344bccd8ba","Type":"ContainerStarted","Data":"360fac0fcbffc9d2d2d5694042fc1d2d076e053ea1a42afb30618fb97a7a517d"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.415536 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-647f6" event={"ID":"ba1c30d1-c2ba-42ce-82d5-7602956ff030","Type":"ContainerStarted","Data":"c479d1ab89638b11d62a17305452c32bafcf1738c8dd7a6a4ca4557a2e3923de"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.417695 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" event={"ID":"ae9fc9e4-550c-4015-a46e-c3f2d2dd4a12","Type":"ContainerStarted","Data":"fe5a9efc938d4c2dca4ea78e7f836e6271f152b1e0a80f8865fcc8e035767159"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.436096 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.438599 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gptpv"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.443351 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" event={"ID":"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a","Type":"ContainerStarted","Data":"199138e68d75fd5cd22a1b862b30993ab5a8ffcdb0b26b02ec6e47a3c95628d7"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.443537 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.445133 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:48.945112325 +0000 UTC m=+199.548528635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.448349 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" event={"ID":"464113cb-0982-46cb-91e1-95fc6e7a9f83","Type":"ContainerStarted","Data":"b0eaca8210254dd98889fde40a54b5916c454dde5612b601a72d7a17ed10d2f9"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.448412 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" event={"ID":"464113cb-0982-46cb-91e1-95fc6e7a9f83","Type":"ContainerStarted","Data":"3052ca206c581d32b649d417442ff0b2f29e107855bf96dcef61b36f9ffa0457"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.449886 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.452645 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29554560-f9x27" event={"ID":"6d6a8bb4-df10-46c3-91e6-826e501be09f","Type":"ContainerStarted","Data":"921c157e53260323082f7bc38336cb13365755923e21251a667d9e191cd4367f"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.452683 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29554560-f9x27" event={"ID":"6d6a8bb4-df10-46c3-91e6-826e501be09f","Type":"ContainerStarted","Data":"c8ac2f01ff8482880a2e603a551a25e996764fa3711f536e3daea741e9c36256"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.458948 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" event={"ID":"14ae78cd-522f-4125-b2d9-84c52dbeadcb","Type":"ContainerStarted","Data":"cb65316bfbec4b9ef19586a0b1948bdfaa3c6ae81152d2ec36a56ba55c9c1d15"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.458988 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.462241 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" event={"ID":"0d1beb33-53de-4a02-acec-735ca52df759","Type":"ContainerStarted","Data":"9d229b328cb5c36d2f4b27fe47c89e00857d9f77968626e62caf819ea77dfd61"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.478671 4870 generic.go:334] "Generic (PLEG): container finished" podID="e80a24bf-734e-476e-9559-4b1bc913802a" containerID="18d1c4d2c62288aa6671824ae0ce20713e398f10a9d1debf14d83f796e3f5fce" exitCode=0
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.478721 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" event={"ID":"e80a24bf-734e-476e-9559-4b1bc913802a","Type":"ContainerDied","Data":"18d1c4d2c62288aa6671824ae0ce20713e398f10a9d1debf14d83f796e3f5fce"}
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.545648 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.547406 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.047387312 +0000 UTC m=+199.650803682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.554630 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6znk2"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.557179 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dcrsg"]
Mar 12 00:11:48 crc kubenswrapper[4870]: W0312 00:11:48.612251 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d00ae6_5fa7_4282_bb14_37c7f3202784.slice/crio-75978b7dfd4bbad79f7757cc84ae98fd9679045fb5a6fcd24a74e5a6b5e2e6a9 WatchSource:0}: Error finding container 75978b7dfd4bbad79f7757cc84ae98fd9679045fb5a6fcd24a74e5a6b5e2e6a9: Status 404 returned error can't find the container with id 75978b7dfd4bbad79f7757cc84ae98fd9679045fb5a6fcd24a74e5a6b5e2e6a9
Mar 12 00:11:48 crc kubenswrapper[4870]: W0312 00:11:48.613688 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb28d2f3_0064_4c7b_a036_1a8080bff91b.slice/crio-10a686d60868c27fb29d77a39b962506230637975025d243d701690b03180fb0 WatchSource:0}: Error finding container 10a686d60868c27fb29d77a39b962506230637975025d243d701690b03180fb0: Status 404 returned error can't find the container with id 10a686d60868c27fb29d77a39b962506230637975025d243d701690b03180fb0
Mar 12 00:11:48 crc kubenswrapper[4870]: W0312 00:11:48.626778 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod581f86c7_aa5f_4071_bba8_fd537cb93402.slice/crio-485b09a27a333e66ace6e05c1893deac1fc8fbc1e8089ae4b54c471f9fead617 WatchSource:0}: Error finding container 485b09a27a333e66ace6e05c1893deac1fc8fbc1e8089ae4b54c471f9fead617: Status 404 returned error can't find the container with id 485b09a27a333e66ace6e05c1893deac1fc8fbc1e8089ae4b54c471f9fead617
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.638614 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k5ks6"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.648320 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.648422 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.148400082 +0000 UTC m=+199.751816392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.648821 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.649784 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.149769432 +0000 UTC m=+199.753185742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.681337 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cz27n" podStartSLOduration=143.681318619 podStartE2EDuration="2m23.681318619s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:48.672496407 +0000 UTC m=+199.275912707" watchObservedRunningTime="2026-03-12 00:11:48.681318619 +0000 UTC m=+199.284734929"
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.694006 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.697735 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554570-l4btp"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.700272 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-clrh6"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.715741 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z5bcc"]
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.749701 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.750044 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.250029829 +0000 UTC m=+199.853446139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:48 crc kubenswrapper[4870]: W0312 00:11:48.791321 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf754ba1_52f1_478d_9b07_1d83e55d3020.slice/crio-c55289b04b7c5d5e73e1b81f6fae47166153af7e59a8e530f3117b3efb0b2891 WatchSource:0}: Error finding container c55289b04b7c5d5e73e1b81f6fae47166153af7e59a8e530f3117b3efb0b2891: Status 404 returned error can't find the container with id c55289b04b7c5d5e73e1b81f6fae47166153af7e59a8e530f3117b3efb0b2891
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.848846 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.851600 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.851924 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.351910684 +0000 UTC m=+199.955326994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:48 crc kubenswrapper[4870]: I0312 00:11:48.955685 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:48 crc kubenswrapper[4870]: E0312 00:11:48.956024 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.456007685 +0000 UTC m=+200.059423995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.061072 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.061743 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.561727185 +0000 UTC m=+200.165143495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.153643 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vbgrg" podStartSLOduration=144.153622003 podStartE2EDuration="2m24.153622003s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.153281663 +0000 UTC m=+199.756697983" watchObservedRunningTime="2026-03-12 00:11:49.153622003 +0000 UTC m=+199.757038313"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.161791 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.162203 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.662185488 +0000 UTC m=+200.265601808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.205605 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" podStartSLOduration=143.205583996 podStartE2EDuration="2m23.205583996s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.205074451 +0000 UTC m=+199.808490771" watchObservedRunningTime="2026-03-12 00:11:49.205583996 +0000 UTC m=+199.809000316"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.270088 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-bbcc4" podStartSLOduration=143.270070631 podStartE2EDuration="2m23.270070631s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.269657579 +0000 UTC m=+199.873073909" watchObservedRunningTime="2026-03-12 00:11:49.270070631 +0000 UTC m=+199.873486941"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.270173 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.270592 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.770579636 +0000 UTC m=+200.373995946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.318027 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-b2d8d" podStartSLOduration=144.318003304 podStartE2EDuration="2m24.318003304s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.296278779 +0000 UTC m=+199.899695109" watchObservedRunningTime="2026-03-12 00:11:49.318003304 +0000 UTC m=+199.921419614"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.318277 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" podStartSLOduration=144.318268842 podStartE2EDuration="2m24.318268842s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.315638674 +0000 UTC m=+199.919054984" watchObservedRunningTime="2026-03-12 00:11:49.318268842 +0000 UTC m=+199.921685152"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.384428 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.384781 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.884741466 +0000 UTC m=+200.488157776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.385220 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.385838 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.885819998 +0000 UTC m=+200.489236308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.426574 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29554560-f9x27" podStartSLOduration=144.426557578 podStartE2EDuration="2m24.426557578s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.351907251 +0000 UTC m=+199.955323571" watchObservedRunningTime="2026-03-12 00:11:49.426557578 +0000 UTC m=+200.029973888"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.446382 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-h8r8v" podStartSLOduration=144.446365586 podStartE2EDuration="2m24.446365586s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.446339365 +0000 UTC m=+200.049755675" watchObservedRunningTime="2026-03-12 00:11:49.446365586 +0000 UTC m=+200.049781896"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.447332 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dpsl" podStartSLOduration=144.447327355 podStartE2EDuration="2m24.447327355s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.425594329 +0000 UTC m=+200.029010639" watchObservedRunningTime="2026-03-12 00:11:49.447327355 +0000 UTC m=+200.050743665"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.488067 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.488528 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:49.988509557 +0000 UTC m=+200.591925867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.504466 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" event={"ID":"0b9a3066-da46-44d9-9cae-9b073151540c","Type":"ContainerStarted","Data":"c1517a5797e5e8ab81e771864eb2bbb0a8556b6631aa519ca9f4a5d66eb6131e"}
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.511892 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6qxr" podStartSLOduration=144.511877551 podStartE2EDuration="2m24.511877551s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.51150788 +0000 UTC m=+200.114924190" watchObservedRunningTime="2026-03-12 00:11:49.511877551 +0000 UTC m=+200.115293861"
Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.512048 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" podStartSLOduration=144.512044066 podStartE2EDuration="2m24.512044066s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.480636724 +0000 UTC m=+200.084053034" watchObservedRunningTime="2026-03-12 00:11:49.512044066 +0000 UTC
m=+200.115460376" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.514290 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" event={"ID":"eb28d2f3-0064-4c7b-a036-1a8080bff91b","Type":"ContainerStarted","Data":"10a686d60868c27fb29d77a39b962506230637975025d243d701690b03180fb0"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.528443 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" event={"ID":"581f86c7-aa5f-4071-bba8-fd537cb93402","Type":"ContainerStarted","Data":"485b09a27a333e66ace6e05c1893deac1fc8fbc1e8089ae4b54c471f9fead617"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.530395 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" event={"ID":"6d0aa413-952c-42dd-9267-a3f29e9bebe1","Type":"ContainerStarted","Data":"57476167575dc25c0671f7efff712a1b0618349da89013d7f5f19f22218dfb08"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.531559 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.542065 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" event={"ID":"fd66140c-ae6e-461a-84a9-597fdb115dd8","Type":"ContainerStarted","Data":"e259949b096c57469ba38bb689329415ae95a808e504e4c0511869b6422c95d9"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.543407 4870 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tg468 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 12 00:11:49 crc 
kubenswrapper[4870]: I0312 00:11:49.543451 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" podUID="6d0aa413-952c-42dd-9267-a3f29e9bebe1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.547664 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" event={"ID":"bfd7abe7-1271-40a0-b011-eb4841fb3c03","Type":"ContainerStarted","Data":"e80cddf3238673c5e52e92f76ccece80290ef212ee3f2a955415bc83eaf9f8a4"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.552812 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" podStartSLOduration=143.552799216 podStartE2EDuration="2m23.552799216s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.551789416 +0000 UTC m=+200.155205726" watchObservedRunningTime="2026-03-12 00:11:49.552799216 +0000 UTC m=+200.156215526" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.576018 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-647f6" event={"ID":"ba1c30d1-c2ba-42ce-82d5-7602956ff030","Type":"ContainerStarted","Data":"4dbfcbefe20645b464926b5a92c957b87636eaf069409b1c986c05ad88f63011"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.589890 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: 
\"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.596274 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.096257567 +0000 UTC m=+200.699673877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.625626 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" event={"ID":"ee267da0-c1eb-4a5d-80d5-da65c77a7c23","Type":"ContainerStarted","Data":"d37cf310317aa81577b0057cd6c0b61e92a2298996bf8a4479cd8a6015b350e7"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.627195 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554570-l4btp" event={"ID":"cf754ba1-52f1-478d-9b07-1d83e55d3020","Type":"ContainerStarted","Data":"c55289b04b7c5d5e73e1b81f6fae47166153af7e59a8e530f3117b3efb0b2891"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.629243 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" event={"ID":"d2507766-d14c-437b-a485-91563bb9b272","Type":"ContainerStarted","Data":"44c6cced78fe902df436058355f9f76c9150b1a85819f3f49e1f1d59e576b600"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.629268 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" event={"ID":"d2507766-d14c-437b-a485-91563bb9b272","Type":"ContainerStarted","Data":"941988a83d827cde738ceaaad12e3894bdf45ef5b1e7663276f8c5f79269bc04"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.630280 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" event={"ID":"aded5c32-6731-43cc-8701-4d847d663dd2","Type":"ContainerStarted","Data":"dc354a36b35d68b3cdb91cd65d4b8b4a59181e5fedfe921f7548b98a5a7c7302"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.634784 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7vhpt" event={"ID":"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e","Type":"ContainerStarted","Data":"4d0c65cd4ac908bcc2576c3e7e36000d7f7e1df31f885b79b10143caf61f625d"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.637918 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nkq8z" event={"ID":"dd365575-f7d8-45f6-b5fc-ef6069e47374","Type":"ContainerStarted","Data":"12cabbbb80e030f7d86a721b0c596578b402c18991e2a87232b7975f9a4bc6a9"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.647506 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-j5lkw" podStartSLOduration=144.647491158 podStartE2EDuration="2m24.647491158s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.596528785 +0000 UTC m=+200.199945085" watchObservedRunningTime="2026-03-12 00:11:49.647491158 +0000 UTC m=+200.250907468" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.649021 4870 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" event={"ID":"8dc311ee-fc53-4e2b-8b1d-d512f36208cb","Type":"ContainerStarted","Data":"3460b64e493e7876531e13c77100d36db773dcd3ae996dfbe8368da99facd6ab"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.661516 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" event={"ID":"eb913154-2066-4879-9598-0a72095a8a5d","Type":"ContainerStarted","Data":"0616027ac76c3d8b4c9a952e920c71c8e7bb2e847d0c99d14079c9484c9380ab"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.685894 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-647f6" podStartSLOduration=144.685878108 podStartE2EDuration="2m24.685878108s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.648475087 +0000 UTC m=+200.251891397" watchObservedRunningTime="2026-03-12 00:11:49.685878108 +0000 UTC m=+200.289294418" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.694965 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.695105 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.195088651 +0000 UTC m=+200.798504961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.696947 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.697031 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" event={"ID":"2554ffbb-61ab-48bc-bc78-05ae8517f40d","Type":"ContainerStarted","Data":"acb1495c024ed6f9a73bc09e0cad8e6e006bcd95f75c38f1b513f7ee51a49871"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.697232 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.697342 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.197334838 +0000 UTC m=+200.800751148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.699841 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" event={"ID":"c83e637c-fad7-47e8-a70b-8871cf03c832","Type":"ContainerStarted","Data":"e4d1c824a2f70e6281abc321fda3fedb3063d16453b8bbe2cdee33b7d1bb2836"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.699881 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" event={"ID":"c83e637c-fad7-47e8-a70b-8871cf03c832","Type":"ContainerStarted","Data":"b553f987ca0e8955c09aa7ed5eb477cb3313457bc7ea243432bd5904bbcc86bf"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.708457 4870 patch_prober.go:28] interesting pod/console-operator-58897d9998-5l4xj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.708499 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" podUID="2554ffbb-61ab-48bc-bc78-05ae8517f40d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.712860 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7vhpt" podStartSLOduration=5.7128473589999995 podStartE2EDuration="5.712847359s" podCreationTimestamp="2026-03-12 00:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.687491296 +0000 UTC m=+200.290907596" watchObservedRunningTime="2026-03-12 00:11:49.712847359 +0000 UTC m=+200.316263669" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.714700 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ccgj" podStartSLOduration=144.714695844 podStartE2EDuration="2m24.714695844s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.713736365 +0000 UTC m=+200.317152695" watchObservedRunningTime="2026-03-12 00:11:49.714695844 +0000 UTC m=+200.318112154" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.722269 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" event={"ID":"6a38e148-742e-4f33-a30c-7289fad54acb","Type":"ContainerStarted","Data":"4dd3980d498bd7a5db49e3ddffe63c8536e0a8032881471451790e0413106d5b"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.737353 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" event={"ID":"a67d7087-6ab3-42ff-b5cc-cba186b5b036","Type":"ContainerStarted","Data":"c5e1cfa1151a96572395b7d263d8eece8674bfb150da37b0c52bf5abc8045c68"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.741572 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" event={"ID":"0d1beb33-53de-4a02-acec-735ca52df759","Type":"ContainerStarted","Data":"550381b07097b24bf8381b6182642076e70a1388059321abef074f5cc1383a05"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.758854 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" event={"ID":"596347fa-d520-46af-b25c-860d7c0d91a4","Type":"ContainerStarted","Data":"5185e8f1811f0dc86275c0f9b8c957077869b48365fc53a5f642dc87f7109f46"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.760696 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" event={"ID":"3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a","Type":"ContainerStarted","Data":"5d5344d6daafd2644c017caf7a0380a3af35d30fd5770568ad10143491421ef5"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.806656 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.807441 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.307426697 +0000 UTC m=+200.910842997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.809168 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" event={"ID":"93b8bee4-6475-4c86-abeb-92e555f4e1eb","Type":"ContainerStarted","Data":"b6a5115706cdc4c91d9c53ff8b00944cbe30b0435cf2dcabf5ede12d57bfbdba"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.809688 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" podStartSLOduration=144.809666674 podStartE2EDuration="2m24.809666674s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.75395937 +0000 UTC m=+200.357375680" watchObservedRunningTime="2026-03-12 00:11:49.809666674 +0000 UTC m=+200.413082984" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.810464 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5l64f" podStartSLOduration=144.810459747 podStartE2EDuration="2m24.810459747s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.809299573 +0000 UTC m=+200.412715883" watchObservedRunningTime="2026-03-12 00:11:49.810459747 +0000 UTC m=+200.413876057" Mar 12 00:11:49 crc 
kubenswrapper[4870]: I0312 00:11:49.819750 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" event={"ID":"a0cb447f-792e-4438-b3f2-32bd2f408f03","Type":"ContainerStarted","Data":"dd141dbfe696a62ca5bceecd6f5519459052dea88f92c4af61318c24aedb2732"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.826724 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" event={"ID":"c0465219-4339-46be-90ab-0e4519f19493","Type":"ContainerStarted","Data":"ec59c6334f72d3a94bd65037bd3528bad813540450d6d69f7bacda7078636890"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.832364 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k5ks6" event={"ID":"68521596-9390-4238-b387-99895749ff85","Type":"ContainerStarted","Data":"a34fd6db2349b85780e575768047f4a74d3360c03c5b17385d6b3a4a8e8d1747"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.833539 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" event={"ID":"ca72e460-0d01-4a2d-9796-c3a65dd38aec","Type":"ContainerStarted","Data":"5b521304ca8d1c0387b77f33dc48b97dfe846a89065c18a46376c2fcd3f967a1"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.840506 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvh84" podStartSLOduration=144.840489329 podStartE2EDuration="2m24.840489329s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.833716648 +0000 UTC m=+200.437132958" watchObservedRunningTime="2026-03-12 00:11:49.840489329 +0000 UTC m=+200.443905639" Mar 12 00:11:49 crc 
kubenswrapper[4870]: I0312 00:11:49.844406 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" event={"ID":"1fafe5ce-3f4c-4e99-b673-edd2836d0392","Type":"ContainerStarted","Data":"6c3ac5dcd3c2b8d19ae72e2f72df996c6563cb174a76c3d02793f20d60b0f743"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.866463 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" event={"ID":"33f902cf-f8f8-4895-8238-691ef6d7686d","Type":"ContainerStarted","Data":"feea583777ff89442368e9add7631d27f278f38ebdd5f9c099e4c356c80a630b"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.895855 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wzkc5" podStartSLOduration=144.895835122 podStartE2EDuration="2m24.895835122s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.888469464 +0000 UTC m=+200.491885774" watchObservedRunningTime="2026-03-12 00:11:49.895835122 +0000 UTC m=+200.499251432" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.910165 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:49 crc kubenswrapper[4870]: E0312 00:11:49.911657 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-12 00:11:50.411645662 +0000 UTC m=+201.015061972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.938788 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" event={"ID":"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185","Type":"ContainerStarted","Data":"399833aa741e8b597dde1b1b3c2a0294b92acefa2f4a1a88fbb76c7ec11dea21"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.938833 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" event={"ID":"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185","Type":"ContainerStarted","Data":"61f281d4f447e8a98ad2c54bb4cba615687226606089fd0d35b58fac5e686896"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.964048 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" event={"ID":"b0d00ae6-5fa7-4282-bb14-37c7f3202784","Type":"ContainerStarted","Data":"75978b7dfd4bbad79f7757cc84ae98fd9679045fb5a6fcd24a74e5a6b5e2e6a9"} Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.965779 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-h8r8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.965828 4870 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h8r8v" podUID="dae6a345-cb5d-4553-868f-232fc4ec81af" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.966476 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.976901 4870 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kwb4k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.976964 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" podUID="c3b68206-2dd1-410e-930d-a97b21caddc9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.980321 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:49 crc kubenswrapper[4870]: I0312 00:11:49.983281 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" podStartSLOduration=143.983263898 podStartE2EDuration="2m23.983263898s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:49.975774756 +0000 UTC m=+200.579191076" 
watchObservedRunningTime="2026-03-12 00:11:49.983263898 +0000 UTC m=+200.586680208"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.010755 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.011237 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.511218788 +0000 UTC m=+201.114635098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.071114 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k84g8" podStartSLOduration=145.071083966 podStartE2EDuration="2m25.071083966s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:50.062748978 +0000 UTC m=+200.666165298" watchObservedRunningTime="2026-03-12 00:11:50.071083966 +0000 UTC m=+200.674500276"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.111534 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" podStartSLOduration=144.111500926 podStartE2EDuration="2m24.111500926s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:50.110810666 +0000 UTC m=+200.714226986" watchObservedRunningTime="2026-03-12 00:11:50.111500926 +0000 UTC m=+200.714917226"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.113739 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.126607 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.626588444 +0000 UTC m=+201.230004814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.148328 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" podStartSLOduration=145.148308619 podStartE2EDuration="2m25.148308619s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:50.14734397 +0000 UTC m=+200.750760280" watchObservedRunningTime="2026-03-12 00:11:50.148308619 +0000 UTC m=+200.751724939"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.174093 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ghs7z" podStartSLOduration=145.174069794 podStartE2EDuration="2m25.174069794s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:50.172747965 +0000 UTC m=+200.776164275" watchObservedRunningTime="2026-03-12 00:11:50.174069794 +0000 UTC m=+200.777486104"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.215374 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.215562 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.715532215 +0000 UTC m=+201.318948525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.215926 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.216302 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.716294488 +0000 UTC m=+201.319710868 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.316823 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.316996 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.816967147 +0000 UTC m=+201.420383457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.317347 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.317635 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.817622937 +0000 UTC m=+201.421039247 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.420673 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.421018 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.920975135 +0000 UTC m=+201.524391445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.421295 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.421731 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:50.921718248 +0000 UTC m=+201.525134608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.516815 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-647f6"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.518382 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.518545 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.521924 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.522027 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.021998555 +0000 UTC m=+201.625414865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.522236 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.522533 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.022522961 +0000 UTC m=+201.625939271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.625258 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.625459 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.125431897 +0000 UTC m=+201.728848207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.625705 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.626278 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.126250671 +0000 UTC m=+201.729666981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.726517 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.727415 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.227396213 +0000 UTC m=+201.830812523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.828073 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.828599 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.328578258 +0000 UTC m=+201.931994588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.928711 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 00:11:50 crc kubenswrapper[4870]: E0312 00:11:50.929015 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.42900272 +0000 UTC m=+202.032419030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.952372 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.970604 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" event={"ID":"33f902cf-f8f8-4895-8238-691ef6d7686d","Type":"ContainerStarted","Data":"97b1c42485ee2d8170a6a9a1d0cc7ea04bb623faf67abd982db472e17197164b"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.973927 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" event={"ID":"e80a24bf-734e-476e-9559-4b1bc913802a","Type":"ContainerStarted","Data":"08c696f521f67dc03d576fa7481b155ffe06d6d1432f2f1438e6fa58a5be0eba"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.975749 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" event={"ID":"eb28d2f3-0064-4c7b-a036-1a8080bff91b","Type":"ContainerStarted","Data":"e8551e96f577171465a569eaab6d84622741ffb5a7447eb9e8e2c7e864daf8f5"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.976317 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.981313 4870 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wdxcq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body=
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.981369 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" podUID="eb28d2f3-0064-4c7b-a036-1a8080bff91b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.982899 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" event={"ID":"bfd7abe7-1271-40a0-b011-eb4841fb3c03","Type":"ContainerStarted","Data":"92376b818cddfba69a6ffc735df744438886dc20508b426fc277a53db04c5e62"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.984655 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" event={"ID":"d2507766-d14c-437b-a485-91563bb9b272","Type":"ContainerStarted","Data":"a99553f8b79b64e2e7b46bb19d3960200d6eef313be9cf8e60d84aafe9e5ad51"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.985932 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" event={"ID":"aded5c32-6731-43cc-8701-4d847d663dd2","Type":"ContainerStarted","Data":"bf838d0b3006f049c8dc5eb4585bd200d0a44166c412ea7ec9b69c0e993d938c"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.988363 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gptpv" event={"ID":"b0d00ae6-5fa7-4282-bb14-37c7f3202784","Type":"ContainerStarted","Data":"8a753e65ef4a4936a9f0576a442a94d73e21146628f788b1095baa29e86f450e"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.990345 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" event={"ID":"596347fa-d520-46af-b25c-860d7c0d91a4","Type":"ContainerStarted","Data":"d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.990936 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.992207 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7vhpt" event={"ID":"3b59d5bd-f4ec-4ac0-b1ca-b5a5231a407e","Type":"ContainerStarted","Data":"a8298d2df6c70168b14cc409147202a8694de134ca46a62ffe6e728635dadec2"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.992886 4870 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6znk2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.992920 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused"
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.994956 4870 generic.go:334] "Generic (PLEG): container finished" podID="93b8bee4-6475-4c86-abeb-92e555f4e1eb" containerID="b6a5115706cdc4c91d9c53ff8b00944cbe30b0435cf2dcabf5ede12d57bfbdba" exitCode=0
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.994997 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" event={"ID":"93b8bee4-6475-4c86-abeb-92e555f4e1eb","Type":"ContainerDied","Data":"b6a5115706cdc4c91d9c53ff8b00944cbe30b0435cf2dcabf5ede12d57bfbdba"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.995014 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" event={"ID":"93b8bee4-6475-4c86-abeb-92e555f4e1eb","Type":"ContainerStarted","Data":"79d3852ec142da0496ae290588b4c9a00061be3273706ff59187c5637f1186d9"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.996434 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" event={"ID":"0d1beb33-53de-4a02-acec-735ca52df759","Type":"ContainerStarted","Data":"0464693d1381ac9cbfd8ac3817f8dab1b27c2d6cc4bf322f999e8b7ceb088cac"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.997911 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" event={"ID":"c83e637c-fad7-47e8-a70b-8871cf03c832","Type":"ContainerStarted","Data":"84c57253016cd4d39207f6e74dcfa943df1fe6bcc0e6d5e930cad4a10055934f"}
Mar 12 00:11:50 crc kubenswrapper[4870]: I0312 00:11:50.998317 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.008029 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k5ks6" event={"ID":"68521596-9390-4238-b387-99895749ff85","Type":"ContainerStarted","Data":"e9b5a290ae6c61c53d3bc184c267e91774d83096bd6e863b86938e6bd11f8675"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.010747 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" event={"ID":"0b9a3066-da46-44d9-9cae-9b073151540c","Type":"ContainerStarted","Data":"0c2005481ca836fc5b0ff321afb9fa298471805343d59c155a47d0d3dfaa25f2"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.028788 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" podStartSLOduration=145.028771772 podStartE2EDuration="2m25.028771772s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.028554426 +0000 UTC m=+201.631970736" watchObservedRunningTime="2026-03-12 00:11:51.028771772 +0000 UTC m=+201.632188072"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.030397 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" event={"ID":"eb913154-2066-4879-9598-0a72095a8a5d","Type":"ContainerStarted","Data":"839bf3d7c383224018826fc13f12a4e9a788f12a464302971690e608b6fb2616"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.032227 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv"
Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.032496 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.532485622 +0000 UTC m=+202.135901932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.036978 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" event={"ID":"581f86c7-aa5f-4071-bba8-fd537cb93402","Type":"ContainerStarted","Data":"3962b06fd4fd82ba6bed8ac6402f4198a037d9d03a49c3659740f420671e2cc4"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.037579 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.039934 4870 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4kkqt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.039998 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" podUID="581f86c7-aa5f-4071-bba8-fd537cb93402" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.040926 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" event={"ID":"1fafe5ce-3f4c-4e99-b673-edd2836d0392","Type":"ContainerStarted","Data":"24b9910d6670d8492f4b55f9ed6efcefa56c9303233c1c6ddadb0f71a68076d0"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.040958 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" event={"ID":"1fafe5ce-3f4c-4e99-b673-edd2836d0392","Type":"ContainerStarted","Data":"4fb86181ad8bc86f98996ee56c191f776873c9c0fcd6288d9e1f716f69f305ea"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.043368 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nkq8z" event={"ID":"dd365575-f7d8-45f6-b5fc-ef6069e47374","Type":"ContainerStarted","Data":"03c941a080b3035a40795121aab823b4b6972583eabcc767e7b672cd5b1e72a3"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.043401 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nkq8z" event={"ID":"dd365575-f7d8-45f6-b5fc-ef6069e47374","Type":"ContainerStarted","Data":"c114fbaa118d6f3d8da1cf89a4267a31489c70d3580024f136b40bf3997a03ab"}
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.044289 4870 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kwb4k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.044361 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" podUID="c3b68206-2dd1-410e-930d-a97b21caddc9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.045517 4870 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tg468 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.045559 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" podUID="6d0aa413-952c-42dd-9267-a3f29e9bebe1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.052298 4870 patch_prober.go:28] interesting pod/console-operator-58897d9998-5l4xj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.052365 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" podUID="2554ffbb-61ab-48bc-bc78-05ae8517f40d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.054740 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" podStartSLOduration=145.054723953 podStartE2EDuration="2m25.054723953s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.052653961 +0000 UTC m=+201.656070271" watchObservedRunningTime="2026-03-12 00:11:51.054723953 +0000 UTC m=+201.658140263"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.072337 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqcmc" podStartSLOduration=146.072321205 podStartE2EDuration="2m26.072321205s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.070305865 +0000 UTC m=+201.673722175" watchObservedRunningTime="2026-03-12 00:11:51.072321205 +0000 UTC m=+201.675737515"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.096851 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" podStartSLOduration=145.096834413 podStartE2EDuration="2m25.096834413s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.095879335 +0000 UTC m=+201.699295645" watchObservedRunningTime="2026-03-12 00:11:51.096834413 +0000 UTC m=+201.700250723"
Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.113330 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5dpd" podStartSLOduration=145.113309452 podStartE2EDuration="2m25.113309452s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.112921121 +0000 UTC
m=+201.716337431" watchObservedRunningTime="2026-03-12 00:11:51.113309452 +0000 UTC m=+201.716725762" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.133279 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.135961 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.635942934 +0000 UTC m=+202.239359244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.140566 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zsdq9" podStartSLOduration=146.140549591 podStartE2EDuration="2m26.140549591s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.140551211 +0000 UTC m=+201.743967521" watchObservedRunningTime="2026-03-12 00:11:51.140549591 +0000 UTC m=+201.743965901" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 
00:11:51.183172 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k5ks6" podStartSLOduration=7.183128476 podStartE2EDuration="7.183128476s" podCreationTimestamp="2026-03-12 00:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.173101668 +0000 UTC m=+201.776517978" watchObservedRunningTime="2026-03-12 00:11:51.183128476 +0000 UTC m=+201.786544786" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.201075 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5frd2" podStartSLOduration=145.201055928 podStartE2EDuration="2m25.201055928s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.199035188 +0000 UTC m=+201.802451518" watchObservedRunningTime="2026-03-12 00:11:51.201055928 +0000 UTC m=+201.804472238" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.223001 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bvpw" podStartSLOduration=146.222977789 podStartE2EDuration="2m26.222977789s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.216877898 +0000 UTC m=+201.820294228" watchObservedRunningTime="2026-03-12 00:11:51.222977789 +0000 UTC m=+201.826394099" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.237827 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.238304 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.738285423 +0000 UTC m=+202.341701733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.249525 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" podStartSLOduration=145.249502746 podStartE2EDuration="2m25.249502746s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.24829138 +0000 UTC m=+201.851707710" watchObservedRunningTime="2026-03-12 00:11:51.249502746 +0000 UTC m=+201.852919056" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.302735 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35300: no serving certificate available for the kubelet" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.338689 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.338798 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.838776057 +0000 UTC m=+202.442192367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.339223 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.339500 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.839481308 +0000 UTC m=+202.442897618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.395015 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35308: no serving certificate available for the kubelet" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.440573 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.440903 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:51.940889279 +0000 UTC m=+202.544305589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.477471 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.477920 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.481097 4870 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-ht5gd container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.481183 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" podUID="a67d7087-6ab3-42ff-b5cc-cba186b5b036" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.497512 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35318: no serving certificate available for the kubelet" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.518121 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": 
dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.518210 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.542255 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.542647 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.04262981 +0000 UTC m=+202.646046120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.544313 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35330: no serving certificate available for the kubelet" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.617060 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35344: no serving certificate available for the kubelet" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.643015 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.643383 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.143355431 +0000 UTC m=+202.746771741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.726203 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35352: no serving certificate available for the kubelet" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.745224 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.745587 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.245568026 +0000 UTC m=+202.848984336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.846856 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.847094 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.34706503 +0000 UTC m=+202.950481340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.847207 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.847603 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.347587026 +0000 UTC m=+202.951003336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.906286 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35358: no serving certificate available for the kubelet" Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.948761 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.948921 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.448887294 +0000 UTC m=+203.052303604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:51 crc kubenswrapper[4870]: I0312 00:11:51.949062 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:51 crc kubenswrapper[4870]: E0312 00:11:51.949359 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.449351277 +0000 UTC m=+203.052767587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050273 4870 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6znk2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050304 4870 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4kkqt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050314 4870 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wdxcq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050327 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" podUID="581f86c7-aa5f-4071-bba8-fd537cb93402" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 
00:11:52.050327 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050361 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq" podUID="eb28d2f3-0064-4c7b-a036-1a8080bff91b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050356 4870 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tg468 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050415 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" podUID="6d0aa413-952c-42dd-9267-a3f29e9bebe1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.050711 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.052363 4870 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.552335575 +0000 UTC m=+203.155751965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.068348 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-clrh6" podStartSLOduration=146.06833225 podStartE2EDuration="2m26.06833225s" podCreationTimestamp="2026-03-12 00:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:52.067533507 +0000 UTC m=+202.670949827" watchObservedRunningTime="2026-03-12 00:11:52.06833225 +0000 UTC m=+202.671748560" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.068470 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcrsg" podStartSLOduration=147.068464614 podStartE2EDuration="2m27.068464614s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:51.269569672 +0000 UTC m=+201.872985982" watchObservedRunningTime="2026-03-12 00:11:52.068464614 +0000 UTC m=+202.671880924" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 
00:11:52.102237 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" podStartSLOduration=147.102219857 podStartE2EDuration="2m27.102219857s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:52.099421384 +0000 UTC m=+202.702837694" watchObservedRunningTime="2026-03-12 00:11:52.102219857 +0000 UTC m=+202.705636167" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.121899 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nkq8z" podStartSLOduration=8.12187873 podStartE2EDuration="8.12187873s" podCreationTimestamp="2026-03-12 00:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:52.12151961 +0000 UTC m=+202.724935930" watchObservedRunningTime="2026-03-12 00:11:52.12187873 +0000 UTC m=+202.725295050" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.150966 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8jsrj" podStartSLOduration=147.150948524 podStartE2EDuration="2m27.150948524s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:52.147749649 +0000 UTC m=+202.751165959" watchObservedRunningTime="2026-03-12 00:11:52.150948524 +0000 UTC m=+202.754364834" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.157778 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.158217 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.658191709 +0000 UTC m=+203.261608069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.259633 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35366: no serving certificate available for the kubelet" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.262507 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.262716 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.762683791 +0000 UTC m=+203.366100101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.262771 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.263351 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.763341731 +0000 UTC m=+203.366758041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.363969 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.364396 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.864379811 +0000 UTC m=+203.467796121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.465489 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.466219 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:52.966203485 +0000 UTC m=+203.569619795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.518361 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.518418 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.567701 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.568100 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.06807769 +0000 UTC m=+203.671494000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.669410 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.669763 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.169751239 +0000 UTC m=+203.773167549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.771000 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.771396 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.271379676 +0000 UTC m=+203.874795986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.871998 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.872365 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.372347284 +0000 UTC m=+203.975763594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.889846 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.944774 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35372: no serving certificate available for the kubelet" Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.973537 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.973729 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.473708204 +0000 UTC m=+204.077124514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:52 crc kubenswrapper[4870]: I0312 00:11:52.973766 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:52 crc kubenswrapper[4870]: E0312 00:11:52.974351 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.474332413 +0000 UTC m=+204.077748713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.069261 4870 generic.go:334] "Generic (PLEG): container finished" podID="7e879f21-c2e5-48f6-ad4b-b86f1e3eb185" containerID="399833aa741e8b597dde1b1b3c2a0294b92acefa2f4a1a88fbb76c7ec11dea21" exitCode=0 Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.069373 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" event={"ID":"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185","Type":"ContainerDied","Data":"399833aa741e8b597dde1b1b3c2a0294b92acefa2f4a1a88fbb76c7ec11dea21"} Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.079146 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.079519 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.579503726 +0000 UTC m=+204.182920036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.093453 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" event={"ID":"e80a24bf-734e-476e-9559-4b1bc913802a","Type":"ContainerStarted","Data":"d8728619d3ab291f479790ac091f3f97ac9eb9ba8d79c20317ed19b04c99c6cd"} Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.095103 4870 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6znk2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.095136 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.104248 4870 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-w8l5k container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.104297 4870 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" podUID="93b8bee4-6475-4c86-abeb-92e555f4e1eb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.130217 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" podStartSLOduration=148.130200321 podStartE2EDuration="2m28.130200321s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:53.127783999 +0000 UTC m=+203.731200309" watchObservedRunningTime="2026-03-12 00:11:53.130200321 +0000 UTC m=+203.733616631" Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.180546 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.181565 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.681552086 +0000 UTC m=+204.284968396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.281757 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.282087 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.782072481 +0000 UTC m=+204.385488791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.383663 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.384399 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.884376398 +0000 UTC m=+204.487792788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.450514 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nkq8z" Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.485187 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.485583 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:53.985562633 +0000 UTC m=+204.588978953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.521700 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:11:53 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:11:53 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:11:53 crc kubenswrapper[4870]: healthz check failed Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.521772 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.586594 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.587712 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-12 00:11:54.087681505 +0000 UTC m=+204.691097815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.687519 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.687759 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.187730716 +0000 UTC m=+204.791147026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.687834 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.688171 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.188162299 +0000 UTC m=+204.791578609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.788839 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.788995 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.288956292 +0000 UTC m=+204.892372592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.789194 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.789497 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.289488788 +0000 UTC m=+204.892905098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.890179 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.890448 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.390412524 +0000 UTC m=+204.993828834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.890564 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.890971 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.3909449 +0000 UTC m=+204.994361210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.991660 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.991875 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.491842446 +0000 UTC m=+205.095258766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:53 crc kubenswrapper[4870]: I0312 00:11:53.991956 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:53 crc kubenswrapper[4870]: E0312 00:11:53.992299 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.49229092 +0000 UTC m=+205.095707280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.092657 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.092807 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.592779293 +0000 UTC m=+205.196195603 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.093206 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.093502 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.593490675 +0000 UTC m=+205.196906985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.141762 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" event={"ID":"ca72e460-0d01-4a2d-9796-c3a65dd38aec","Type":"ContainerStarted","Data":"856b983b276e564444ec88c75fe992e3be07d211512432742cb530d70c39aaf2"} Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.194881 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.195155 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.695106242 +0000 UTC m=+205.298522562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.195375 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.195779 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.695760141 +0000 UTC m=+205.299176451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.297006 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.297203 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.797175042 +0000 UTC m=+205.400591352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.302850 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.303585 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.803563791 +0000 UTC m=+205.406980101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.378623 4870 ???:1] "http: TLS handshake error from 192.168.126.11:35382: no serving certificate available for the kubelet" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.403558 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.403860 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:54.903845279 +0000 UTC m=+205.507261589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.508162 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.508800 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.008787445 +0000 UTC m=+205.612203755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.526320 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:11:54 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:11:54 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:11:54 crc kubenswrapper[4870]: healthz check failed Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.526387 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.609610 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.609947 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-12 00:11:55.109932709 +0000 UTC m=+205.713349019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.621822 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.711114 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-config-volume\") pod \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.711181 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jfpx\" (UniqueName: \"kubernetes.io/projected/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-kube-api-access-9jfpx\") pod \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.711386 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-secret-volume\") pod \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\" (UID: \"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185\") " Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.711636 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.711948 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.211936637 +0000 UTC m=+205.815352937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.713080 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e879f21-c2e5-48f6-ad4b-b86f1e3eb185" (UID: "7e879f21-c2e5-48f6-ad4b-b86f1e3eb185"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.743970 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e879f21-c2e5-48f6-ad4b-b86f1e3eb185" (UID: "7e879f21-c2e5-48f6-ad4b-b86f1e3eb185"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.756658 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-kube-api-access-9jfpx" (OuterVolumeSpecName: "kube-api-access-9jfpx") pod "7e879f21-c2e5-48f6-ad4b-b86f1e3eb185" (UID: "7e879f21-c2e5-48f6-ad4b-b86f1e3eb185"). InnerVolumeSpecName "kube-api-access-9jfpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.759174 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kwb4k"] Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.759469 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" podUID="c3b68206-2dd1-410e-930d-a97b21caddc9" containerName="controller-manager" containerID="cri-o://6c029b3048c5208524a0f5c9d8d4cd7514e3ec4d0d5134461415c26c6322ea4d" gracePeriod=30 Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.771726 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.783202 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"] Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.783391 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" podUID="d8041594-4bbd-408a-b59d-26bb0e17a95e" containerName="route-controller-manager" containerID="cri-o://8fb02b59d562217cfb4bbd1b6e01206d7f4937ae9016911c86f668bed40e4e44" gracePeriod=30 Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.821958 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.822225 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.822240 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.822251 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jfpx\" (UniqueName: \"kubernetes.io/projected/7e879f21-c2e5-48f6-ad4b-b86f1e3eb185-kube-api-access-9jfpx\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.822315 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.322299144 +0000 UTC m=+205.925715454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:54 crc kubenswrapper[4870]: I0312 00:11:54.923411 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:54 crc kubenswrapper[4870]: E0312 00:11:54.923802 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.423784158 +0000 UTC m=+206.027200468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.024072 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.024311 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.524284632 +0000 UTC m=+206.127700942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.024666 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.024951 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.524939002 +0000 UTC m=+206.128355312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.064924 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.065132 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e879f21-c2e5-48f6-ad4b-b86f1e3eb185" containerName="collect-profiles" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.065165 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e879f21-c2e5-48f6-ad4b-b86f1e3eb185" containerName="collect-profiles" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.065274 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e879f21-c2e5-48f6-ad4b-b86f1e3eb185" containerName="collect-profiles" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.065615 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.070606 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.070721 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.096676 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.127016 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.127315 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.627299861 +0000 UTC m=+206.230716171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.228403 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.228484 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90049330-d722-4247-b8da-8bd5ce41ac41-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90049330-d722-4247-b8da-8bd5ce41ac41\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.228517 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.228551 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/90049330-d722-4247-b8da-8bd5ce41ac41-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90049330-d722-4247-b8da-8bd5ce41ac41\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.228585 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.228981 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.72896608 +0000 UTC m=+206.332382390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.230034 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.238620 4870 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.242090 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.250781 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m78hv"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.252260 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.253944 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.257483 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.261110 4870 generic.go:334] "Generic (PLEG): container finished" podID="d8041594-4bbd-408a-b59d-26bb0e17a95e" containerID="8fb02b59d562217cfb4bbd1b6e01206d7f4937ae9016911c86f668bed40e4e44" exitCode=0 Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.261228 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" event={"ID":"d8041594-4bbd-408a-b59d-26bb0e17a95e","Type":"ContainerDied","Data":"8fb02b59d562217cfb4bbd1b6e01206d7f4937ae9016911c86f668bed40e4e44"} Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.261856 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m78hv"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.265500 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.266825 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" event={"ID":"7e879f21-c2e5-48f6-ad4b-b86f1e3eb185","Type":"ContainerDied","Data":"61f281d4f447e8a98ad2c54bb4cba615687226606089fd0d35b58fac5e686896"} Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.266929 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f281d4f447e8a98ad2c54bb4cba615687226606089fd0d35b58fac5e686896" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.267067 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554560-rqkss" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.280502 4870 generic.go:334] "Generic (PLEG): container finished" podID="c3b68206-2dd1-410e-930d-a97b21caddc9" containerID="6c029b3048c5208524a0f5c9d8d4cd7514e3ec4d0d5134461415c26c6322ea4d" exitCode=0 Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.280927 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" event={"ID":"c3b68206-2dd1-410e-930d-a97b21caddc9","Type":"ContainerDied","Data":"6c029b3048c5208524a0f5c9d8d4cd7514e3ec4d0d5134461415c26c6322ea4d"} Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.287228 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" event={"ID":"ca72e460-0d01-4a2d-9796-c3a65dd38aec","Type":"ContainerStarted","Data":"517b4e39e58fdd6fcb6f851b659b9a743fe82f81900ae3d74fe171440366b43e"} Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.330476 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.330786 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.830750612 +0000 UTC m=+206.434166922 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331074 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331136 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-utilities\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331184 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90049330-d722-4247-b8da-8bd5ce41ac41-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90049330-d722-4247-b8da-8bd5ce41ac41\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331208 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90049330-d722-4247-b8da-8bd5ce41ac41-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90049330-d722-4247-b8da-8bd5ce41ac41\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331239 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331267 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331284 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-catalog-content\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 
00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331318 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5v97\" (UniqueName: \"kubernetes.io/projected/61a02593-b52d-470c-967d-565b6fafde45-kube-api-access-q5v97\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331336 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.331512 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90049330-d722-4247-b8da-8bd5ce41ac41-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90049330-d722-4247-b8da-8bd5ce41ac41\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.332656 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.832636528 +0000 UTC m=+206.436052838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.335663 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.339905 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.340948 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c62c8d9-0f6b-4ec4-af08-fae75fb41288-metrics-certs\") pod \"network-metrics-daemon-xkrk6\" (UID: \"5c62c8d9-0f6b-4ec4-af08-fae75fb41288\") " pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.353259 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90049330-d722-4247-b8da-8bd5ce41ac41-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"90049330-d722-4247-b8da-8bd5ce41ac41\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.363930 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.397423 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.434216 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6kl\" (UniqueName: \"kubernetes.io/projected/c3b68206-2dd1-410e-930d-a97b21caddc9-kube-api-access-vc6kl\") pod \"c3b68206-2dd1-410e-930d-a97b21caddc9\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.434398 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.434447 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-proxy-ca-bundles\") pod \"c3b68206-2dd1-410e-930d-a97b21caddc9\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.434552 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-client-ca\") pod \"d8041594-4bbd-408a-b59d-26bb0e17a95e\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " Mar 12 
00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.434623 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b68206-2dd1-410e-930d-a97b21caddc9-serving-cert\") pod \"c3b68206-2dd1-410e-930d-a97b21caddc9\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.434845 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.934806032 +0000 UTC m=+206.538222382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.438062 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-client-ca" (OuterVolumeSpecName: "client-ca") pod "d8041594-4bbd-408a-b59d-26bb0e17a95e" (UID: "d8041594-4bbd-408a-b59d-26bb0e17a95e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.440620 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b68206-2dd1-410e-930d-a97b21caddc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3b68206-2dd1-410e-930d-a97b21caddc9" (UID: "c3b68206-2dd1-410e-930d-a97b21caddc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.443237 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qfz5g"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.448563 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c3b68206-2dd1-410e-930d-a97b21caddc9" (UID: "c3b68206-2dd1-410e-930d-a97b21caddc9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.448756 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8041594-4bbd-408a-b59d-26bb0e17a95e-serving-cert\") pod \"d8041594-4bbd-408a-b59d-26bb0e17a95e\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.448812 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-config\") pod \"d8041594-4bbd-408a-b59d-26bb0e17a95e\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.448842 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh92z\" (UniqueName: \"kubernetes.io/projected/d8041594-4bbd-408a-b59d-26bb0e17a95e-kube-api-access-sh92z\") pod \"d8041594-4bbd-408a-b59d-26bb0e17a95e\" (UID: \"d8041594-4bbd-408a-b59d-26bb0e17a95e\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.448902 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-client-ca\") pod 
\"c3b68206-2dd1-410e-930d-a97b21caddc9\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.449446 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3b68206-2dd1-410e-930d-a97b21caddc9" (UID: "c3b68206-2dd1-410e-930d-a97b21caddc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.449893 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-config" (OuterVolumeSpecName: "config") pod "d8041594-4bbd-408a-b59d-26bb0e17a95e" (UID: "d8041594-4bbd-408a-b59d-26bb0e17a95e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.449912 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-config\") pod \"c3b68206-2dd1-410e-930d-a97b21caddc9\" (UID: \"c3b68206-2dd1-410e-930d-a97b21caddc9\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.455186 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b68206-2dd1-410e-930d-a97b21caddc9-kube-api-access-vc6kl" (OuterVolumeSpecName: "kube-api-access-vc6kl") pod "c3b68206-2dd1-410e-930d-a97b21caddc9" (UID: "c3b68206-2dd1-410e-930d-a97b21caddc9"). InnerVolumeSpecName "kube-api-access-vc6kl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.456732 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b68206-2dd1-410e-930d-a97b21caddc9" containerName="controller-manager" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.456758 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b68206-2dd1-410e-930d-a97b21caddc9" containerName="controller-manager" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.456770 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8041594-4bbd-408a-b59d-26bb0e17a95e" containerName="route-controller-manager" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.456777 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8041594-4bbd-408a-b59d-26bb0e17a95e" containerName="route-controller-manager" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.456911 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b68206-2dd1-410e-930d-a97b21caddc9" containerName="controller-manager" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.456931 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8041594-4bbd-408a-b59d-26bb0e17a95e" containerName="route-controller-manager" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.457134 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-config" (OuterVolumeSpecName: "config") pod "c3b68206-2dd1-410e-930d-a97b21caddc9" (UID: "c3b68206-2dd1-410e-930d-a97b21caddc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.457598 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.458566 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8041594-4bbd-408a-b59d-26bb0e17a95e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d8041594-4bbd-408a-b59d-26bb0e17a95e" (UID: "d8041594-4bbd-408a-b59d-26bb0e17a95e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.459899 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8041594-4bbd-408a-b59d-26bb0e17a95e-kube-api-access-sh92z" (OuterVolumeSpecName: "kube-api-access-sh92z") pod "d8041594-4bbd-408a-b59d-26bb0e17a95e" (UID: "d8041594-4bbd-408a-b59d-26bb0e17a95e"). InnerVolumeSpecName "kube-api-access-sh92z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.460398 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-utilities\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.460550 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.460954 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:55.960941238 +0000 UTC m=+206.564357548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461206 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-catalog-content\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461290 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5v97\" (UniqueName: \"kubernetes.io/projected/61a02593-b52d-470c-967d-565b6fafde45-kube-api-access-q5v97\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461409 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3b68206-2dd1-410e-930d-a97b21caddc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461420 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8041594-4bbd-408a-b59d-26bb0e17a95e-serving-cert\") 
on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461431 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461441 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh92z\" (UniqueName: \"kubernetes.io/projected/d8041594-4bbd-408a-b59d-26bb0e17a95e-kube-api-access-sh92z\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461450 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461458 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461468 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6kl\" (UniqueName: \"kubernetes.io/projected/c3b68206-2dd1-410e-930d-a97b21caddc9-kube-api-access-vc6kl\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461476 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3b68206-2dd1-410e-930d-a97b21caddc9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.461485 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8041594-4bbd-408a-b59d-26bb0e17a95e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.462277 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-catalog-content\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.463339 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-utilities\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.467584 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.476042 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qfz5g"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.500209 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5v97\" (UniqueName: \"kubernetes.io/projected/61a02593-b52d-470c-967d-565b6fafde45-kube-api-access-q5v97\") pod \"certified-operators-m78hv\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") " pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.524225 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:11:55 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:11:55 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:11:55 crc kubenswrapper[4870]: healthz check failed Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 
00:11:55.524278 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.538724 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.564250 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.564652 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9lr\" (UniqueName: \"kubernetes.io/projected/633cb50d-ccf5-4e3c-a40f-05581c94950e-kube-api-access-sk9lr\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.564718 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-catalog-content\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.564746 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-utilities\") pod 
\"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.564909 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:56.064887095 +0000 UTC m=+206.668303405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.569119 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.581684 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkrk6" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.594809 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.632840 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8hh9h"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.633837 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.652875 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hh9h"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.666014 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-catalog-content\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.666058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-utilities\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.666086 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.666167 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9lr\" (UniqueName: \"kubernetes.io/projected/633cb50d-ccf5-4e3c-a40f-05581c94950e-kube-api-access-sk9lr\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.666986 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-catalog-content\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.667297 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-utilities\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.667651 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:56.167633616 +0000 UTC m=+206.771049926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.698222 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9lr\" (UniqueName: \"kubernetes.io/projected/633cb50d-ccf5-4e3c-a40f-05581c94950e-kube-api-access-sk9lr\") pod \"community-operators-qfz5g\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") " pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.767493 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.767839 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 00:11:56.26780129 +0000 UTC m=+206.871217600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.768165 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-catalog-content\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.768218 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.768254 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-utilities\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.768295 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fx54\" (UniqueName: 
\"kubernetes.io/projected/3ba3252e-f349-49ce-87d9-64172121150c-kube-api-access-6fx54\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: E0312 00:11:55.768748 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 00:11:56.268738148 +0000 UTC m=+206.872154658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c88kv" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.788762 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.797511 4870 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-12T00:11:55.238649527Z","Handler":null,"Name":""} Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.816728 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.848433 4870 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.848520 4870 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.852401 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5gt9r"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.857856 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.870504 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.874285 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-catalog-content\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.874375 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-utilities\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.874418 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fx54\" (UniqueName: \"kubernetes.io/projected/3ba3252e-f349-49ce-87d9-64172121150c-kube-api-access-6fx54\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.876304 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-catalog-content\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") 
" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.876550 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-utilities\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.885710 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.887346 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5gt9r"] Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.903939 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w8l5k" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.904864 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fx54\" (UniqueName: \"kubernetes.io/projected/3ba3252e-f349-49ce-87d9-64172121150c-kube-api-access-6fx54\") pod \"certified-operators-8hh9h\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.958878 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.976788 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-catalog-content\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.976867 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfg7k\" (UniqueName: \"kubernetes.io/projected/985b1034-4300-4cdf-a09a-33d70a0ea7b0-kube-api-access-hfg7k\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.976891 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.976943 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-utilities\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.984409 4870 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 12 00:11:55 crc kubenswrapper[4870]: I0312 00:11:55.984464 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.025038 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c88kv\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.078641 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-utilities\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.078721 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-catalog-content\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.078767 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hfg7k\" (UniqueName: \"kubernetes.io/projected/985b1034-4300-4cdf-a09a-33d70a0ea7b0-kube-api-access-hfg7k\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.079550 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-utilities\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.079791 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-catalog-content\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.122869 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfg7k\" (UniqueName: \"kubernetes.io/projected/985b1034-4300-4cdf-a09a-33d70a0ea7b0-kube-api-access-hfg7k\") pod \"community-operators-5gt9r\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.123131 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.191682 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.236974 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.327531 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" event={"ID":"c3b68206-2dd1-410e-930d-a97b21caddc9","Type":"ContainerDied","Data":"98d3823efa951ecc1afedd07bfe1e8b717e75443d0bd9b72f78e34470b4e3702"} Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.327586 4870 scope.go:117] "RemoveContainer" containerID="6c029b3048c5208524a0f5c9d8d4cd7514e3ec4d0d5134461415c26c6322ea4d" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.327720 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kwb4k" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.333243 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e2aa7c02313d52780f55fda84db529cc2bebfa9a4bd3b00a8a4ec6d93fc063ce"} Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.333280 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2f5ca216bfc6a69b90a4a53e18e6b7bbd6d82d0cd6e0bdcc5623afb91e2d0fb0"} Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.352946 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" event={"ID":"ca72e460-0d01-4a2d-9796-c3a65dd38aec","Type":"ContainerStarted","Data":"a81e1625e5ed62a4e2ee3fa3736619168ae1dbb94827da6b29ba07d6df2d1a91"} Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.352982 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" event={"ID":"ca72e460-0d01-4a2d-9796-c3a65dd38aec","Type":"ContainerStarted","Data":"35d1a69727ae9d58837d070fdf3d60817751ae4b7100e008729c91d34f33d1d5"} Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.360773 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b25800c9d8ea86f9653efff0b702f967239dbcb258e0eafb6f95736635913a44"} Mar 12 00:11:56 crc kubenswrapper[4870]: W0312 00:11:56.371890 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c148d83f8443152625f7b5af764d0de5d1d543edcb6a78835c999ae8f4243ede WatchSource:0}: Error finding container c148d83f8443152625f7b5af764d0de5d1d543edcb6a78835c999ae8f4243ede: Status 404 returned error can't find the container with id c148d83f8443152625f7b5af764d0de5d1d543edcb6a78835c999ae8f4243ede Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.375288 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90049330-d722-4247-b8da-8bd5ce41ac41","Type":"ContainerStarted","Data":"416627a19f0f14db075ed7b4903896f87baf922a43a34fcf900fbc8693045d61"} Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.381474 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.381508 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.393263 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kwb4k"] Mar 12 
00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.406788 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m78hv"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.407073 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" event={"ID":"d8041594-4bbd-408a-b59d-26bb0e17a95e","Type":"ContainerDied","Data":"8c8957f063e26179f47fd70fd1dd02dae3342fd106edd11a67f094216c1ba36d"} Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.407116 4870 scope.go:117] "RemoveContainer" containerID="8fb02b59d562217cfb4bbd1b6e01206d7f4937ae9016911c86f668bed40e4e44" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.407136 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.418261 4870 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lspxp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]log ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]etcd ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/generic-apiserver-start-informers ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/max-in-flight-filter ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 12 00:11:56 crc kubenswrapper[4870]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 12 00:11:56 crc kubenswrapper[4870]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/project.openshift.io-projectcache ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/openshift.io-startinformers ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 12 00:11:56 crc kubenswrapper[4870]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 12 00:11:56 crc kubenswrapper[4870]: livez check failed Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.419052 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lspxp" podUID="e80a24bf-734e-476e-9559-4b1bc913802a" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.443855 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kwb4k"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.447873 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-z5bcc" podStartSLOduration=12.447853683 podStartE2EDuration="12.447853683s" podCreationTimestamp="2026-03-12 00:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:56.419535442 +0000 UTC m=+207.022951752" watchObservedRunningTime="2026-03-12 00:11:56.447853683 +0000 UTC m=+207.051269993" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.448284 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xkrk6"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.474238 4870 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.475232 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-78r6p"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.483299 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qfz5g"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.491469 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hh9h"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.495325 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.509323 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht5gd" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.528464 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:11:56 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:11:56 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:11:56 crc kubenswrapper[4870]: healthz check failed Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.528514 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.566549 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-h8r8v 
container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.566672 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-h8r8v" podUID="dae6a345-cb5d-4553-868f-232fc4ec81af" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.578587 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-h8r8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.578655 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h8r8v" podUID="dae6a345-cb5d-4553-868f-232fc4ec81af" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.622734 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.639649 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.661216 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.661573 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.661795 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.662054 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.662476 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.665101 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.677113 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.678941 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.681197 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.683171 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.683285 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.683320 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.683674 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.684402 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.684429 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.686641 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.694386 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.700054 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5gt9r"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.777693 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.780009 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.791025 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c88kv"] Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.793244 4870 patch_prober.go:28] interesting pod/console-f9d7485db-vbgrg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.793313 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vbgrg" podUID="8d26541a-27be-4bb8-99f2-43f63e4729a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.800712 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7wd\" (UniqueName: \"kubernetes.io/projected/19b0ce3c-f432-48f4-81ed-62cf96995f8d-kube-api-access-kl7wd\") pod 
\"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.800813 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdhs\" (UniqueName: \"kubernetes.io/projected/53febc79-03f6-4672-889c-818fa0b8d11d-kube-api-access-lfdhs\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.800843 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19b0ce3c-f432-48f4-81ed-62cf96995f8d-serving-cert\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.800901 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-client-ca\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.800970 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-proxy-ca-bundles\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc 
kubenswrapper[4870]: I0312 00:11:56.801060 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-config\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.801266 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53febc79-03f6-4672-889c-818fa0b8d11d-serving-cert\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.801295 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-client-ca\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.801414 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-config\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: W0312 00:11:56.836136 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18b4fa2c_97f8_4de2_8d2c_d3fee5338e1a.slice/crio-2c9771370db82e4981e59ecc649924dbd0fa35eaf63836bf90b8109030238397 WatchSource:0}: Error finding container 2c9771370db82e4981e59ecc649924dbd0fa35eaf63836bf90b8109030238397: Status 404 returned error can't find the container with id 2c9771370db82e4981e59ecc649924dbd0fa35eaf63836bf90b8109030238397 Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903233 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53febc79-03f6-4672-889c-818fa0b8d11d-serving-cert\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903293 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-client-ca\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903357 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-config\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903408 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl7wd\" (UniqueName: \"kubernetes.io/projected/19b0ce3c-f432-48f4-81ed-62cf96995f8d-kube-api-access-kl7wd\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: 
\"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903466 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdhs\" (UniqueName: \"kubernetes.io/projected/53febc79-03f6-4672-889c-818fa0b8d11d-kube-api-access-lfdhs\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903501 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19b0ce3c-f432-48f4-81ed-62cf96995f8d-serving-cert\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903537 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-client-ca\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903564 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-proxy-ca-bundles\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.903615 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-config\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.904434 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-client-ca\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.905581 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-client-ca\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.905637 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-config\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.906191 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-proxy-ca-bundles\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.909882 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53febc79-03f6-4672-889c-818fa0b8d11d-serving-cert\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.919269 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5l4xj" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.919633 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-config\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.930489 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19b0ce3c-f432-48f4-81ed-62cf96995f8d-serving-cert\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.939946 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7wd\" (UniqueName: \"kubernetes.io/projected/19b0ce3c-f432-48f4-81ed-62cf96995f8d-kube-api-access-kl7wd\") pod \"route-controller-manager-7c8fc7887f-x5bsr\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.951913 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdhs\" (UniqueName: 
\"kubernetes.io/projected/53febc79-03f6-4672-889c-818fa0b8d11d-kube-api-access-lfdhs\") pod \"controller-manager-7c797dd6d5-m8pj6\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.982854 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:11:56 crc kubenswrapper[4870]: I0312 00:11:56.999187 4870 ???:1] "http: TLS handshake error from 192.168.126.11:37386: no serving certificate available for the kubelet" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.182434 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.219192 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"] Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.296743 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4kkqt" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.370479 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tg468" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.433104 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qt622"] Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.434039 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.436479 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.444799 4870 generic.go:334] "Generic (PLEG): container finished" podID="61a02593-b52d-470c-967d-565b6fafde45" containerID="8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6" exitCode=0 Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.444890 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m78hv" event={"ID":"61a02593-b52d-470c-967d-565b6fafde45","Type":"ContainerDied","Data":"8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.444917 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m78hv" event={"ID":"61a02593-b52d-470c-967d-565b6fafde45","Type":"ContainerStarted","Data":"636dd20bce29b2c8c2eaaf6858e2b23a112ccdd774e72cdb594503c057c5f8bf"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.447541 4870 generic.go:334] "Generic (PLEG): container finished" podID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerID="1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c" exitCode=0 Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.447635 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qfz5g" event={"ID":"633cb50d-ccf5-4e3c-a40f-05581c94950e","Type":"ContainerDied","Data":"1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.447668 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qfz5g" 
event={"ID":"633cb50d-ccf5-4e3c-a40f-05581c94950e","Type":"ContainerStarted","Data":"6ef678e27eb1eaa09e4440b9184ccaedf27c60553db3cce52876ec2a4191793f"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.447965 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt622"] Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.479661 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" event={"ID":"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a","Type":"ContainerStarted","Data":"7fe106341923b90330a49acbf89a3052f930b3f0a5de8e6563bfe0507afd2851"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.479714 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" event={"ID":"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a","Type":"ContainerStarted","Data":"2c9771370db82e4981e59ecc649924dbd0fa35eaf63836bf90b8109030238397"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.480322 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.482548 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"] Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.487708 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" event={"ID":"5c62c8d9-0f6b-4ec4-af08-fae75fb41288","Type":"ContainerStarted","Data":"6e26d5f540d648db5db7b161307e4f20e9128fe67837359f3c6899ef7c237fad"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.487762 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" 
event={"ID":"5c62c8d9-0f6b-4ec4-af08-fae75fb41288","Type":"ContainerStarted","Data":"fbc90b1af7b15136e6aadcfca14292cbd06c9b984c4e3da8496db2c2cb3c9ccf"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.515915 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" event={"ID":"19b0ce3c-f432-48f4-81ed-62cf96995f8d","Type":"ContainerStarted","Data":"c6fde54f6eb8bff5e34103bed142261aefba6639f06c92f572b328cd27d8e6a3"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.516472 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.518780 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w975\" (UniqueName: \"kubernetes.io/projected/5c8b915a-17ad-4b09-812f-dea6471a117c-kube-api-access-7w975\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.518933 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-catalog-content\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.518974 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-utilities\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.521595 
4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:11:57 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:11:57 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:11:57 crc kubenswrapper[4870]: healthz check failed Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.521641 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.530300 4870 generic.go:334] "Generic (PLEG): container finished" podID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerID="b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d" exitCode=0 Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.530428 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5gt9r" event={"ID":"985b1034-4300-4cdf-a09a-33d70a0ea7b0","Type":"ContainerDied","Data":"b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.530465 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5gt9r" event={"ID":"985b1034-4300-4cdf-a09a-33d70a0ea7b0","Type":"ContainerStarted","Data":"07b3bff8d24561e4e8ca049873e7feade2cd9bbb30a978ace906c05fc82bc751"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.540689 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.541550 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.546903 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.568534 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.569421 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.573945 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" podStartSLOduration=152.573934 podStartE2EDuration="2m32.573934s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:57.569139608 +0000 UTC m=+208.172555918" watchObservedRunningTime="2026-03-12 00:11:57.573934 +0000 UTC m=+208.177350310" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.583591 4870 generic.go:334] "Generic (PLEG): container finished" podID="90049330-d722-4247-b8da-8bd5ce41ac41" containerID="811b8d0077f216b43b9b5c455ab2d90710769fd13d34447f9e2251e46ec03b74" exitCode=0 Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.583679 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90049330-d722-4247-b8da-8bd5ce41ac41","Type":"ContainerDied","Data":"811b8d0077f216b43b9b5c455ab2d90710769fd13d34447f9e2251e46ec03b74"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.603836 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"160dbfa929e76be1f338cd3a6452950c6bd9bcd6b5cf89984a4766b2dc8fc9e7"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.603875 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c148d83f8443152625f7b5af764d0de5d1d543edcb6a78835c999ae8f4243ede"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.604184 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.609258 4870 generic.go:334] "Generic (PLEG): container finished" podID="3ba3252e-f349-49ce-87d9-64172121150c" containerID="615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1" exitCode=0 Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.609325 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hh9h" event={"ID":"3ba3252e-f349-49ce-87d9-64172121150c","Type":"ContainerDied","Data":"615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.609349 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hh9h" event={"ID":"3ba3252e-f349-49ce-87d9-64172121150c","Type":"ContainerStarted","Data":"a3376767c9ab4285e9f0b69ae77a4c1d7da1be61193ace45a09c6c034120607b"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.615342 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"528fa30b51b7538b95b070dd9df5ffefadd89d238b1260e16bd769b5384fd15b"} Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.624841 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-catalog-content\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.624941 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-utilities\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.625009 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/721cb307-0939-4ab3-9906-6b98a018088b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.625084 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w975\" (UniqueName: \"kubernetes.io/projected/5c8b915a-17ad-4b09-812f-dea6471a117c-kube-api-access-7w975\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.625197 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/721cb307-0939-4ab3-9906-6b98a018088b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.626694 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-catalog-content\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.627333 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-utilities\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.660231 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.663911 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w975\" (UniqueName: \"kubernetes.io/projected/5c8b915a-17ad-4b09-812f-dea6471a117c-kube-api-access-7w975\") pod \"redhat-marketplace-qt622\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") " pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.709507 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wdxcq"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.726560 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/721cb307-0939-4ab3-9906-6b98a018088b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.726740 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/721cb307-0939-4ab3-9906-6b98a018088b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.728433 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/721cb307-0939-4ab3-9906-6b98a018088b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.750576 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/721cb307-0939-4ab3-9906-6b98a018088b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.778491 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.835045 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64trx"]
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.836822 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.842652 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64trx"]
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.930568 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-catalog-content\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.930762 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl2gp\" (UniqueName: \"kubernetes.io/projected/8f61f1af-812a-427f-a392-eec361571de3-kube-api-access-wl2gp\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.930803 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-utilities\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:57 crc kubenswrapper[4870]: I0312 00:11:57.933250 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.032282 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl2gp\" (UniqueName: \"kubernetes.io/projected/8f61f1af-812a-427f-a392-eec361571de3-kube-api-access-wl2gp\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.032330 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-utilities\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.032363 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-catalog-content\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.033126 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-catalog-content\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.033256 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-utilities\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.065408 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl2gp\" (UniqueName: \"kubernetes.io/projected/8f61f1af-812a-427f-a392-eec361571de3-kube-api-access-wl2gp\") pod \"redhat-marketplace-64trx\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.086409 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt622"]
Mar 12 00:11:58 crc kubenswrapper[4870]: W0312 00:11:58.091563 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8b915a_17ad_4b09_812f_dea6471a117c.slice/crio-58664ed2efb651d27408c09695644532b06548244f67df79c1c84099cf49ddf3 WatchSource:0}: Error finding container 58664ed2efb651d27408c09695644532b06548244f67df79c1c84099cf49ddf3: Status 404 returned error can't find the container with id 58664ed2efb651d27408c09695644532b06548244f67df79c1c84099cf49ddf3
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.130977 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b68206-2dd1-410e-930d-a97b21caddc9" path="/var/lib/kubelet/pods/c3b68206-2dd1-410e-930d-a97b21caddc9/volumes"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.131962 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8041594-4bbd-408a-b59d-26bb0e17a95e" path="/var/lib/kubelet/pods/d8041594-4bbd-408a-b59d-26bb0e17a95e/volumes"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.156503 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64trx"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.193423 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.435219 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-78vzj"]
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.437478 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.441720 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-78vzj"]
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.447198 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.523506 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 00:11:58 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld
Mar 12 00:11:58 crc kubenswrapper[4870]: [+]process-running ok
Mar 12 00:11:58 crc kubenswrapper[4870]: healthz check failed
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.523610 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.541790 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64trx"]
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.546978 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-catalog-content\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.547613 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8sk\" (UniqueName: \"kubernetes.io/projected/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-kube-api-access-zh8sk\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.547639 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-utilities\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.649138 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8sk\" (UniqueName: \"kubernetes.io/projected/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-kube-api-access-zh8sk\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.649208 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-utilities\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.649286 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-catalog-content\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.649849 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-catalog-content\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.650447 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-utilities\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.668578 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" event={"ID":"53febc79-03f6-4672-889c-818fa0b8d11d","Type":"ContainerStarted","Data":"95ba9012bb4575865412ecbd08edfec9b191c264fb6d104762d482558fd7fa2d"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.668650 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" event={"ID":"53febc79-03f6-4672-889c-818fa0b8d11d","Type":"ContainerStarted","Data":"df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.668912 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.681023 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.689460 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xkrk6" event={"ID":"5c62c8d9-0f6b-4ec4-af08-fae75fb41288","Type":"ContainerStarted","Data":"8ced7d342e08457086a81de263f4d85db28e170bf1ce8de92dcee592f76585b4"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.695887 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" event={"ID":"19b0ce3c-f432-48f4-81ed-62cf96995f8d","Type":"ContainerStarted","Data":"6cab520308196aee1fd70460b1e62521eff4b0d8382e1a5277b748e26ed51cc3"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.696922 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.704310 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8sk\" (UniqueName: \"kubernetes.io/projected/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-kube-api-access-zh8sk\") pod \"redhat-operators-78vzj\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") " pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.717650 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.730668 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64trx" event={"ID":"8f61f1af-812a-427f-a392-eec361571de3","Type":"ContainerStarted","Data":"3cbd7056487aadf2d3f14496867dbe69cf35d0a88b5b685b0f35adc3eab4eb40"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.743661 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" podStartSLOduration=4.743643682 podStartE2EDuration="4.743643682s" podCreationTimestamp="2026-03-12 00:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:58.711370634 +0000 UTC m=+209.314786944" watchObservedRunningTime="2026-03-12 00:11:58.743643682 +0000 UTC m=+209.347059992"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.744738 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xkrk6" podStartSLOduration=153.744732115 podStartE2EDuration="2m33.744732115s" podCreationTimestamp="2026-03-12 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:58.741956572 +0000 UTC m=+209.345372902" watchObservedRunningTime="2026-03-12 00:11:58.744732115 +0000 UTC m=+209.348148425"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.778543 4870 generic.go:334] "Generic (PLEG): container finished" podID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerID="8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8" exitCode=0
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.778637 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt622" event={"ID":"5c8b915a-17ad-4b09-812f-dea6471a117c","Type":"ContainerDied","Data":"8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.778685 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt622" event={"ID":"5c8b915a-17ad-4b09-812f-dea6471a117c","Type":"ContainerStarted","Data":"58664ed2efb651d27408c09695644532b06548244f67df79c1c84099cf49ddf3"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.792984 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"721cb307-0939-4ab3-9906-6b98a018088b","Type":"ContainerStarted","Data":"df870478d94a2592b39b92bbbb41c94f2a6d7bd1dc0b358dfbbf491ea698054c"}
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.816521 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.830905 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" podStartSLOduration=3.830879633 podStartE2EDuration="3.830879633s" podCreationTimestamp="2026-03-12 00:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:11:58.829512022 +0000 UTC m=+209.432928332" watchObservedRunningTime="2026-03-12 00:11:58.830879633 +0000 UTC m=+209.434295943"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.855683 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r5h2s"]
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.865424 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.901844 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5h2s"]
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.958971 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4t7\" (UniqueName: \"kubernetes.io/projected/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-kube-api-access-ct4t7\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.959052 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-catalog-content\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:58 crc kubenswrapper[4870]: I0312 00:11:58.959096 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-utilities\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.068578 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct4t7\" (UniqueName: \"kubernetes.io/projected/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-kube-api-access-ct4t7\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.068676 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-catalog-content\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.068747 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-utilities\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.069704 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-utilities\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.072028 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-catalog-content\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.102863 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4t7\" (UniqueName: \"kubernetes.io/projected/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-kube-api-access-ct4t7\") pod \"redhat-operators-r5h2s\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.232711 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5h2s"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.316707 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.383736 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90049330-d722-4247-b8da-8bd5ce41ac41-kubelet-dir\") pod \"90049330-d722-4247-b8da-8bd5ce41ac41\" (UID: \"90049330-d722-4247-b8da-8bd5ce41ac41\") "
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.383893 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90049330-d722-4247-b8da-8bd5ce41ac41-kube-api-access\") pod \"90049330-d722-4247-b8da-8bd5ce41ac41\" (UID: \"90049330-d722-4247-b8da-8bd5ce41ac41\") "
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.383883 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90049330-d722-4247-b8da-8bd5ce41ac41-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90049330-d722-4247-b8da-8bd5ce41ac41" (UID: "90049330-d722-4247-b8da-8bd5ce41ac41"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.384082 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90049330-d722-4247-b8da-8bd5ce41ac41-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.400342 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90049330-d722-4247-b8da-8bd5ce41ac41-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90049330-d722-4247-b8da-8bd5ce41ac41" (UID: "90049330-d722-4247-b8da-8bd5ce41ac41"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.469068 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nkq8z"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.486772 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-78vzj"]
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.486994 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90049330-d722-4247-b8da-8bd5ce41ac41-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.520926 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 00:11:59 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld
Mar 12 00:11:59 crc kubenswrapper[4870]: [+]process-running ok
Mar 12 00:11:59 crc kubenswrapper[4870]: healthz check failed
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.520991 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.718760 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5h2s"]
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.808234 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90049330-d722-4247-b8da-8bd5ce41ac41","Type":"ContainerDied","Data":"416627a19f0f14db075ed7b4903896f87baf922a43a34fcf900fbc8693045d61"}
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.808532 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="416627a19f0f14db075ed7b4903896f87baf922a43a34fcf900fbc8693045d61"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.808456 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.810484 4870 generic.go:334] "Generic (PLEG): container finished" podID="8f61f1af-812a-427f-a392-eec361571de3" containerID="642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae" exitCode=0
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.810698 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64trx" event={"ID":"8f61f1af-812a-427f-a392-eec361571de3","Type":"ContainerDied","Data":"642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae"}
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.828128 4870 generic.go:334] "Generic (PLEG): container finished" podID="721cb307-0939-4ab3-9906-6b98a018088b" containerID="77e27157c4f0a5a00787b55bccda740552e7672feb97534270be86565670851e" exitCode=0
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.828219 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"721cb307-0939-4ab3-9906-6b98a018088b","Type":"ContainerDied","Data":"77e27157c4f0a5a00787b55bccda740552e7672feb97534270be86565670851e"}
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.830012 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5h2s" event={"ID":"5a32152f-50ce-4712-8ea4-dc6b72dc6f08","Type":"ContainerStarted","Data":"c1507d8d10e043a4144952f8b20692f160f50f3b9559ea356e9a2e7c3f106e9a"}
Mar 12 00:11:59 crc kubenswrapper[4870]: I0312 00:11:59.831045 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-78vzj" event={"ID":"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1","Type":"ContainerStarted","Data":"0c5688b12997332a6343c2f23ca5a1c798282eb9b6b5355bdd6a0c3c17e24a60"}
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.139593 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554572-7fms9"]
Mar 12 00:12:00 crc kubenswrapper[4870]: E0312 00:12:00.140015 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90049330-d722-4247-b8da-8bd5ce41ac41" containerName="pruner"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.140028 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="90049330-d722-4247-b8da-8bd5ce41ac41" containerName="pruner"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.140174 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="90049330-d722-4247-b8da-8bd5ce41ac41" containerName="pruner"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.140607 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554572-7fms9"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.142853 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.144935 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554572-7fms9"]
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.194802 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lvsp\" (UniqueName: \"kubernetes.io/projected/ec36d635-25f6-4396-9218-6b5aa2c6809b-kube-api-access-4lvsp\") pod \"auto-csr-approver-29554572-7fms9\" (UID: \"ec36d635-25f6-4396-9218-6b5aa2c6809b\") " pod="openshift-infra/auto-csr-approver-29554572-7fms9"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.295833 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lvsp\" (UniqueName: \"kubernetes.io/projected/ec36d635-25f6-4396-9218-6b5aa2c6809b-kube-api-access-4lvsp\") pod \"auto-csr-approver-29554572-7fms9\" (UID: \"ec36d635-25f6-4396-9218-6b5aa2c6809b\") " pod="openshift-infra/auto-csr-approver-29554572-7fms9"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.318014 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lvsp\" (UniqueName: \"kubernetes.io/projected/ec36d635-25f6-4396-9218-6b5aa2c6809b-kube-api-access-4lvsp\") pod \"auto-csr-approver-29554572-7fms9\" (UID: \"ec36d635-25f6-4396-9218-6b5aa2c6809b\") " pod="openshift-infra/auto-csr-approver-29554572-7fms9"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.466293 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554572-7fms9"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.520321 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 00:12:00 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld
Mar 12 00:12:00 crc kubenswrapper[4870]: [+]process-running ok
Mar 12 00:12:00 crc kubenswrapper[4870]: healthz check failed
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.520378 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.845988 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5h2s" event={"ID":"5a32152f-50ce-4712-8ea4-dc6b72dc6f08","Type":"ContainerDied","Data":"edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4"}
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.845919 4870 generic.go:334] "Generic (PLEG): container finished" podID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerID="edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4" exitCode=0
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.852526 4870 generic.go:334] "Generic (PLEG): container finished" podID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerID="4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195" exitCode=0
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.853816 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-78vzj" event={"ID":"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1","Type":"ContainerDied","Data":"4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195"}
Mar 12 00:12:00 crc kubenswrapper[4870]: I0312 00:12:00.986817 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554572-7fms9"]
Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.229251 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.385017 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.391015 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lspxp"
Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.412945 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/721cb307-0939-4ab3-9906-6b98a018088b-kubelet-dir\") pod \"721cb307-0939-4ab3-9906-6b98a018088b\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") "
Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.413095 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName:
\"kubernetes.io/projected/721cb307-0939-4ab3-9906-6b98a018088b-kube-api-access\") pod \"721cb307-0939-4ab3-9906-6b98a018088b\" (UID: \"721cb307-0939-4ab3-9906-6b98a018088b\") " Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.413903 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/721cb307-0939-4ab3-9906-6b98a018088b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "721cb307-0939-4ab3-9906-6b98a018088b" (UID: "721cb307-0939-4ab3-9906-6b98a018088b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.439687 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721cb307-0939-4ab3-9906-6b98a018088b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "721cb307-0939-4ab3-9906-6b98a018088b" (UID: "721cb307-0939-4ab3-9906-6b98a018088b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.516470 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/721cb307-0939-4ab3-9906-6b98a018088b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.516879 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/721cb307-0939-4ab3-9906-6b98a018088b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.524164 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:01 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:01 crc 
kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:01 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.524221 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.860608 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554572-7fms9" event={"ID":"ec36d635-25f6-4396-9218-6b5aa2c6809b","Type":"ContainerStarted","Data":"f141fc3a69840996a09ec8951037c65bf39aa35fb53ccdcf34b41073db14d5ee"} Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.863912 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.863949 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"721cb307-0939-4ab3-9906-6b98a018088b","Type":"ContainerDied","Data":"df870478d94a2592b39b92bbbb41c94f2a6d7bd1dc0b358dfbbf491ea698054c"} Mar 12 00:12:01 crc kubenswrapper[4870]: I0312 00:12:01.863967 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df870478d94a2592b39b92bbbb41c94f2a6d7bd1dc0b358dfbbf491ea698054c" Mar 12 00:12:02 crc kubenswrapper[4870]: I0312 00:12:02.144652 4870 ???:1] "http: TLS handshake error from 192.168.126.11:37390: no serving certificate available for the kubelet" Mar 12 00:12:02 crc kubenswrapper[4870]: I0312 00:12:02.519022 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:02 crc 
kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:02 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:02 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:02 crc kubenswrapper[4870]: I0312 00:12:02.519089 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:02 crc kubenswrapper[4870]: I0312 00:12:02.675011 4870 ???:1] "http: TLS handshake error from 192.168.126.11:37392: no serving certificate available for the kubelet" Mar 12 00:12:03 crc kubenswrapper[4870]: I0312 00:12:03.517770 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:03 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:03 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:03 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:03 crc kubenswrapper[4870]: I0312 00:12:03.518150 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:04 crc kubenswrapper[4870]: I0312 00:12:04.518104 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:04 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:04 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:04 crc 
kubenswrapper[4870]: healthz check failed Mar 12 00:12:04 crc kubenswrapper[4870]: I0312 00:12:04.518180 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:05 crc kubenswrapper[4870]: I0312 00:12:05.529984 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:05 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:05 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:05 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:05 crc kubenswrapper[4870]: I0312 00:12:05.530049 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.524981 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:06 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:06 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:06 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.525081 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.563872 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-h8r8v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.563895 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-h8r8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.563939 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-h8r8v" podUID="dae6a345-cb5d-4553-868f-232fc4ec81af" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.563959 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h8r8v" podUID="dae6a345-cb5d-4553-868f-232fc4ec81af" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.777928 4870 patch_prober.go:28] interesting pod/console-f9d7485db-vbgrg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 12 00:12:06 crc kubenswrapper[4870]: I0312 00:12:06.778006 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vbgrg" podUID="8d26541a-27be-4bb8-99f2-43f63e4729a2" containerName="console" probeResult="failure" 
output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 12 00:12:07 crc kubenswrapper[4870]: I0312 00:12:07.517952 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:07 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:07 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:07 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:07 crc kubenswrapper[4870]: I0312 00:12:07.518054 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:08 crc kubenswrapper[4870]: I0312 00:12:08.527858 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:08 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:08 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:08 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:08 crc kubenswrapper[4870]: I0312 00:12:08.528713 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:09 crc kubenswrapper[4870]: I0312 00:12:09.518908 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:09 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:09 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:09 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:09 crc kubenswrapper[4870]: I0312 00:12:09.519001 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:10 crc kubenswrapper[4870]: I0312 00:12:10.519016 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:10 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:10 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:10 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:10 crc kubenswrapper[4870]: I0312 00:12:10.519094 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:11 crc kubenswrapper[4870]: I0312 00:12:11.518197 4870 patch_prober.go:28] interesting pod/router-default-5444994796-647f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 00:12:11 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Mar 12 00:12:11 crc kubenswrapper[4870]: [+]process-running ok Mar 12 00:12:11 crc kubenswrapper[4870]: healthz check failed Mar 12 00:12:11 crc kubenswrapper[4870]: I0312 
00:12:11.518624 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-647f6" podUID="ba1c30d1-c2ba-42ce-82d5-7602956ff030" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 00:12:12 crc kubenswrapper[4870]: I0312 00:12:12.413712 4870 ???:1] "http: TLS handshake error from 192.168.126.11:45416: no serving certificate available for the kubelet" Mar 12 00:12:12 crc kubenswrapper[4870]: I0312 00:12:12.518027 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:12:12 crc kubenswrapper[4870]: I0312 00:12:12.520066 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-647f6" Mar 12 00:12:13 crc kubenswrapper[4870]: E0312 00:12:13.970184 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 12 00:12:13 crc kubenswrapper[4870]: E0312 00:12:13.970612 4870 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 00:12:13 crc kubenswrapper[4870]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 12 00:12:13 crc kubenswrapper[4870]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4ml8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29554570-l4btp_openshift-infra(cf754ba1-52f1-478d-9b07-1d83e55d3020): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 12 00:12:13 crc kubenswrapper[4870]: > logger="UnhandledError" Mar 12 00:12:13 crc kubenswrapper[4870]: E0312 00:12:13.972545 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29554570-l4btp" podUID="cf754ba1-52f1-478d-9b07-1d83e55d3020" Mar 12 00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.253667 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"] Mar 12 00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.253894 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" podUID="53febc79-03f6-4672-889c-818fa0b8d11d" containerName="controller-manager" containerID="cri-o://95ba9012bb4575865412ecbd08edfec9b191c264fb6d104762d482558fd7fa2d" gracePeriod=30 Mar 12 
00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.264116 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"] Mar 12 00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.264338 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" podUID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" containerName="route-controller-manager" containerID="cri-o://6cab520308196aee1fd70460b1e62521eff4b0d8382e1a5277b748e26ed51cc3" gracePeriod=30 Mar 12 00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.974534 4870 generic.go:334] "Generic (PLEG): container finished" podID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" containerID="6cab520308196aee1fd70460b1e62521eff4b0d8382e1a5277b748e26ed51cc3" exitCode=0 Mar 12 00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.974609 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" event={"ID":"19b0ce3c-f432-48f4-81ed-62cf96995f8d","Type":"ContainerDied","Data":"6cab520308196aee1fd70460b1e62521eff4b0d8382e1a5277b748e26ed51cc3"} Mar 12 00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.977478 4870 generic.go:334] "Generic (PLEG): container finished" podID="53febc79-03f6-4672-889c-818fa0b8d11d" containerID="95ba9012bb4575865412ecbd08edfec9b191c264fb6d104762d482558fd7fa2d" exitCode=0 Mar 12 00:12:14 crc kubenswrapper[4870]: I0312 00:12:14.977561 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" event={"ID":"53febc79-03f6-4672-889c-818fa0b8d11d","Type":"ContainerDied","Data":"95ba9012bb4575865412ecbd08edfec9b191c264fb6d104762d482558fd7fa2d"} Mar 12 00:12:14 crc kubenswrapper[4870]: E0312 00:12:14.979135 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29554570-l4btp" podUID="cf754ba1-52f1-478d-9b07-1d83e55d3020" Mar 12 00:12:16 crc kubenswrapper[4870]: I0312 00:12:16.246696 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:12:16 crc kubenswrapper[4870]: I0312 00:12:16.577848 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-h8r8v" Mar 12 00:12:16 crc kubenswrapper[4870]: I0312 00:12:16.780935 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:12:16 crc kubenswrapper[4870]: I0312 00:12:16.786033 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vbgrg" Mar 12 00:12:16 crc kubenswrapper[4870]: I0312 00:12:16.984001 4870 patch_prober.go:28] interesting pod/route-controller-manager-7c8fc7887f-x5bsr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 12 00:12:16 crc kubenswrapper[4870]: I0312 00:12:16.984051 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" podUID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 12 00:12:17 crc kubenswrapper[4870]: I0312 00:12:17.183758 4870 patch_prober.go:28] interesting pod/controller-manager-7c797dd6d5-m8pj6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 12 00:12:17 crc kubenswrapper[4870]: I0312 00:12:17.184131 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" podUID="53febc79-03f6-4672-889c-818fa0b8d11d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 12 00:12:17 crc kubenswrapper[4870]: I0312 00:12:17.594457 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:12:17 crc kubenswrapper[4870]: I0312 00:12:17.594516 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:12:22 crc kubenswrapper[4870]: E0312 00:12:22.049206 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 12 00:12:22 crc kubenswrapper[4870]: E0312 00:12:22.049668 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sk9lr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qfz5g_openshift-marketplace(633cb50d-ccf5-4e3c-a40f-05581c94950e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 00:12:22 crc kubenswrapper[4870]: E0312 00:12:22.050863 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qfz5g" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" Mar 12 00:12:24 crc 
kubenswrapper[4870]: E0312 00:12:24.437220 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qfz5g" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:24.534796 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:24.535052 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q5v97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-m78hv_openshift-marketplace(61a02593-b52d-470c-967d-565b6fafde45): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:24.536700 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-m78hv" podUID="61a02593-b52d-470c-967d-565b6fafde45" Mar 12 00:12:25 crc 
kubenswrapper[4870]: E0312 00:12:25.528765 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:25.528947 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hfg7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-5gt9r_openshift-marketplace(985b1034-4300-4cdf-a09a-33d70a0ea7b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:25.530542 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5gt9r" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:25.562541 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:25.562684 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fx54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8hh9h_openshift-marketplace(3ba3252e-f349-49ce-87d9-64172121150c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 00:12:25 crc kubenswrapper[4870]: E0312 00:12:25.564101 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8hh9h" podUID="3ba3252e-f349-49ce-87d9-64172121150c" Mar 12 00:12:26 crc 
kubenswrapper[4870]: I0312 00:12:26.060073 4870 generic.go:334] "Generic (PLEG): container finished" podID="6d6a8bb4-df10-46c3-91e6-826e501be09f" containerID="921c157e53260323082f7bc38336cb13365755923e21251a667d9e191cd4367f" exitCode=0 Mar 12 00:12:26 crc kubenswrapper[4870]: I0312 00:12:26.060322 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29554560-f9x27" event={"ID":"6d6a8bb4-df10-46c3-91e6-826e501be09f","Type":"ContainerDied","Data":"921c157e53260323082f7bc38336cb13365755923e21251a667d9e191cd4367f"} Mar 12 00:12:27 crc kubenswrapper[4870]: I0312 00:12:27.709319 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mw8sm" Mar 12 00:12:27 crc kubenswrapper[4870]: I0312 00:12:27.985254 4870 patch_prober.go:28] interesting pod/route-controller-manager-7c8fc7887f-x5bsr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" start-of-body= Mar 12 00:12:27 crc kubenswrapper[4870]: I0312 00:12:27.985308 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" podUID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" Mar 12 00:12:28 crc kubenswrapper[4870]: I0312 00:12:28.183792 4870 patch_prober.go:28] interesting pod/controller-manager-7c797dd6d5-m8pj6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 00:12:28 crc kubenswrapper[4870]: 
I0312 00:12:28.183845 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" podUID="53febc79-03f6-4672-889c-818fa0b8d11d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.108567 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8hh9h" podUID="3ba3252e-f349-49ce-87d9-64172121150c" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.109409 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5gt9r" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.109463 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-m78hv" podUID="61a02593-b52d-470c-967d-565b6fafde45" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.145303 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.145469 4870 kuberuntime_manager.go:1274] "Unhandled Error" 
err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zh8sk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-78vzj_openshift-marketplace(d4b05c20-2025-4ce8-9c10-a31f3e0b20e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.147664 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-78vzj" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.170065 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.200692 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.204410 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"] Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.204968 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53febc79-03f6-4672-889c-818fa0b8d11d" containerName="controller-manager" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.205004 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="53febc79-03f6-4672-889c-818fa0b8d11d" containerName="controller-manager" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.205013 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" containerName="route-controller-manager" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.205021 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" containerName="route-controller-manager" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.205038 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721cb307-0939-4ab3-9906-6b98a018088b" containerName="pruner" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.205044 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="721cb307-0939-4ab3-9906-6b98a018088b" containerName="pruner" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.205179 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="721cb307-0939-4ab3-9906-6b98a018088b" containerName="pruner" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.205192 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="53febc79-03f6-4672-889c-818fa0b8d11d" containerName="controller-manager" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.205205 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" containerName="route-controller-manager" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.205820 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.212709 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"] Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.215105 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.238840 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.239004 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ct4t7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-r5h2s_openshift-marketplace(5a32152f-50ce-4712-8ea4-dc6b72dc6f08): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 00:12:29 crc kubenswrapper[4870]: E0312 00:12:29.240647 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-r5h2s" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366713 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-proxy-ca-bundles\") pod \"53febc79-03f6-4672-889c-818fa0b8d11d\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366768 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53febc79-03f6-4672-889c-818fa0b8d11d-serving-cert\") pod \"53febc79-03f6-4672-889c-818fa0b8d11d\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366829 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62kkf\" (UniqueName: \"kubernetes.io/projected/6d6a8bb4-df10-46c3-91e6-826e501be09f-kube-api-access-62kkf\") pod \"6d6a8bb4-df10-46c3-91e6-826e501be09f\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366871 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-config\") pod \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366892 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-client-ca\") pod \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366927 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d6a8bb4-df10-46c3-91e6-826e501be09f-serviceca\") pod \"6d6a8bb4-df10-46c3-91e6-826e501be09f\" (UID: \"6d6a8bb4-df10-46c3-91e6-826e501be09f\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366956 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19b0ce3c-f432-48f4-81ed-62cf96995f8d-serving-cert\") pod \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.366992 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-config\") pod \"53febc79-03f6-4672-889c-818fa0b8d11d\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367021 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl7wd\" (UniqueName: \"kubernetes.io/projected/19b0ce3c-f432-48f4-81ed-62cf96995f8d-kube-api-access-kl7wd\") pod \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\" (UID: \"19b0ce3c-f432-48f4-81ed-62cf96995f8d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367045 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-client-ca\") pod \"53febc79-03f6-4672-889c-818fa0b8d11d\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367074 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfdhs\" (UniqueName: \"kubernetes.io/projected/53febc79-03f6-4672-889c-818fa0b8d11d-kube-api-access-lfdhs\") pod \"53febc79-03f6-4672-889c-818fa0b8d11d\" (UID: \"53febc79-03f6-4672-889c-818fa0b8d11d\") " Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367293 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-config\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367363 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv88r\" (UniqueName: \"kubernetes.io/projected/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-kube-api-access-mv88r\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367398 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-client-ca\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " 
pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367430 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-serving-cert\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367456 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "53febc79-03f6-4672-889c-818fa0b8d11d" (UID: "53febc79-03f6-4672-889c-818fa0b8d11d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.367845 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d6a8bb4-df10-46c3-91e6-826e501be09f-serviceca" (OuterVolumeSpecName: "serviceca") pod "6d6a8bb4-df10-46c3-91e6-826e501be09f" (UID: "6d6a8bb4-df10-46c3-91e6-826e501be09f"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.368132 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "19b0ce3c-f432-48f4-81ed-62cf96995f8d" (UID: "19b0ce3c-f432-48f4-81ed-62cf96995f8d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.368222 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-config" (OuterVolumeSpecName: "config") pod "19b0ce3c-f432-48f4-81ed-62cf96995f8d" (UID: "19b0ce3c-f432-48f4-81ed-62cf96995f8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.368411 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-client-ca" (OuterVolumeSpecName: "client-ca") pod "53febc79-03f6-4672-889c-818fa0b8d11d" (UID: "53febc79-03f6-4672-889c-818fa0b8d11d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.370102 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-config" (OuterVolumeSpecName: "config") pod "53febc79-03f6-4672-889c-818fa0b8d11d" (UID: "53febc79-03f6-4672-889c-818fa0b8d11d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.373507 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53febc79-03f6-4672-889c-818fa0b8d11d-kube-api-access-lfdhs" (OuterVolumeSpecName: "kube-api-access-lfdhs") pod "53febc79-03f6-4672-889c-818fa0b8d11d" (UID: "53febc79-03f6-4672-889c-818fa0b8d11d"). InnerVolumeSpecName "kube-api-access-lfdhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.373564 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b0ce3c-f432-48f4-81ed-62cf96995f8d-kube-api-access-kl7wd" (OuterVolumeSpecName: "kube-api-access-kl7wd") pod "19b0ce3c-f432-48f4-81ed-62cf96995f8d" (UID: "19b0ce3c-f432-48f4-81ed-62cf96995f8d"). InnerVolumeSpecName "kube-api-access-kl7wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.374795 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6a8bb4-df10-46c3-91e6-826e501be09f-kube-api-access-62kkf" (OuterVolumeSpecName: "kube-api-access-62kkf") pod "6d6a8bb4-df10-46c3-91e6-826e501be09f" (UID: "6d6a8bb4-df10-46c3-91e6-826e501be09f"). InnerVolumeSpecName "kube-api-access-62kkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.375275 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53febc79-03f6-4672-889c-818fa0b8d11d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53febc79-03f6-4672-889c-818fa0b8d11d" (UID: "53febc79-03f6-4672-889c-818fa0b8d11d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.385292 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b0ce3c-f432-48f4-81ed-62cf96995f8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "19b0ce3c-f432-48f4-81ed-62cf96995f8d" (UID: "19b0ce3c-f432-48f4-81ed-62cf96995f8d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468410 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv88r\" (UniqueName: \"kubernetes.io/projected/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-kube-api-access-mv88r\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468461 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-client-ca\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468488 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-serving-cert\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468536 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-config\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468578 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62kkf\" (UniqueName: 
\"kubernetes.io/projected/6d6a8bb4-df10-46c3-91e6-826e501be09f-kube-api-access-62kkf\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468589 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468598 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19b0ce3c-f432-48f4-81ed-62cf96995f8d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468607 4870 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d6a8bb4-df10-46c3-91e6-826e501be09f-serviceca\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468615 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19b0ce3c-f432-48f4-81ed-62cf96995f8d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468623 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468631 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl7wd\" (UniqueName: \"kubernetes.io/projected/19b0ce3c-f432-48f4-81ed-62cf96995f8d-kube-api-access-kl7wd\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468639 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 
00:12:29.468648 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfdhs\" (UniqueName: \"kubernetes.io/projected/53febc79-03f6-4672-889c-818fa0b8d11d-kube-api-access-lfdhs\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468656 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53febc79-03f6-4672-889c-818fa0b8d11d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.468666 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53febc79-03f6-4672-889c-818fa0b8d11d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.469959 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-config\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.470750 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-client-ca\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.477910 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-serving-cert\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " 
pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.485316 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv88r\" (UniqueName: \"kubernetes.io/projected/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-kube-api-access-mv88r\") pod \"route-controller-manager-545dc5d7dc-zvv5b\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") " pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.545855 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.789715 4870 csr.go:261] certificate signing request csr-9xrmc is approved, waiting to be issued Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.795949 4870 csr.go:257] certificate signing request csr-9xrmc is issued Mar 12 00:12:29 crc kubenswrapper[4870]: I0312 00:12:29.930653 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"] Mar 12 00:12:29 crc kubenswrapper[4870]: W0312 00:12:29.944979 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c551080_fcc6_4509_95b6_3a9f8cbcabf3.slice/crio-84792f1a263bf7fabf62ff77ca996dd76639ca939ac71170c667cde22b5b3720 WatchSource:0}: Error finding container 84792f1a263bf7fabf62ff77ca996dd76639ca939ac71170c667cde22b5b3720: Status 404 returned error can't find the container with id 84792f1a263bf7fabf62ff77ca996dd76639ca939ac71170c667cde22b5b3720 Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.078982 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29554560-f9x27" 
event={"ID":"6d6a8bb4-df10-46c3-91e6-826e501be09f","Type":"ContainerDied","Data":"c8ac2f01ff8482880a2e603a551a25e996764fa3711f536e3daea741e9c36256"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.079287 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8ac2f01ff8482880a2e603a551a25e996764fa3711f536e3daea741e9c36256" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.079010 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29554560-f9x27" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.079946 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" event={"ID":"19b0ce3c-f432-48f4-81ed-62cf96995f8d","Type":"ContainerDied","Data":"c6fde54f6eb8bff5e34103bed142261aefba6639f06c92f572b328cd27d8e6a3"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.079995 4870 scope.go:117] "RemoveContainer" containerID="6cab520308196aee1fd70460b1e62521eff4b0d8382e1a5277b748e26ed51cc3" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.080007 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.081940 4870 generic.go:334] "Generic (PLEG): container finished" podID="8f61f1af-812a-427f-a392-eec361571de3" containerID="37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001" exitCode=0 Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.082007 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64trx" event={"ID":"8f61f1af-812a-427f-a392-eec361571de3","Type":"ContainerDied","Data":"37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.084785 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" event={"ID":"9c551080-fcc6-4509-95b6-3a9f8cbcabf3","Type":"ContainerStarted","Data":"e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.084819 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" event={"ID":"9c551080-fcc6-4509-95b6-3a9f8cbcabf3","Type":"ContainerStarted","Data":"84792f1a263bf7fabf62ff77ca996dd76639ca939ac71170c667cde22b5b3720"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.085022 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.087268 4870 patch_prober.go:28] interesting pod/route-controller-manager-545dc5d7dc-zvv5b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 12 00:12:30 
crc kubenswrapper[4870]: I0312 00:12:30.087306 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" podUID="9c551080-fcc6-4509-95b6-3a9f8cbcabf3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.088510 4870 generic.go:334] "Generic (PLEG): container finished" podID="cf754ba1-52f1-478d-9b07-1d83e55d3020" containerID="b14148f3b729554d1abb5d773802a4d249b27bd29d37af8bdaeb5b80c269258f" exitCode=0 Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.088582 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554570-l4btp" event={"ID":"cf754ba1-52f1-478d-9b07-1d83e55d3020","Type":"ContainerDied","Data":"b14148f3b729554d1abb5d773802a4d249b27bd29d37af8bdaeb5b80c269258f"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.095753 4870 generic.go:334] "Generic (PLEG): container finished" podID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerID="ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c" exitCode=0 Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.096012 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt622" event={"ID":"5c8b915a-17ad-4b09-812f-dea6471a117c","Type":"ContainerDied","Data":"ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.097481 4870 generic.go:334] "Generic (PLEG): container finished" podID="ec36d635-25f6-4396-9218-6b5aa2c6809b" containerID="f5dfcbb3bab0bb4c82fb75499c7cf6d7e84426362dd5384009293fb99e4f45f6" exitCode=0 Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.097539 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554572-7fms9" 
event={"ID":"ec36d635-25f6-4396-9218-6b5aa2c6809b","Type":"ContainerDied","Data":"f5dfcbb3bab0bb4c82fb75499c7cf6d7e84426362dd5384009293fb99e4f45f6"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.099681 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" event={"ID":"53febc79-03f6-4672-889c-818fa0b8d11d","Type":"ContainerDied","Data":"df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36"} Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.099740 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.103093 4870 scope.go:117] "RemoveContainer" containerID="95ba9012bb4575865412ecbd08edfec9b191c264fb6d104762d482558fd7fa2d" Mar 12 00:12:30 crc kubenswrapper[4870]: E0312 00:12:30.103533 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-78vzj" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" Mar 12 00:12:30 crc kubenswrapper[4870]: E0312 00:12:30.112587 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-r5h2s" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" Mar 12 00:12:30 crc kubenswrapper[4870]: E0312 00:12:30.182698 4870 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b0ce3c_f432_48f4_81ed_62cf96995f8d.slice/crio-c6fde54f6eb8bff5e34103bed142261aefba6639f06c92f572b328cd27d8e6a3: Error finding container 
c6fde54f6eb8bff5e34103bed142261aefba6639f06c92f572b328cd27d8e6a3: Status 404 returned error can't find the container with id c6fde54f6eb8bff5e34103bed142261aefba6639f06c92f572b328cd27d8e6a3 Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.202848 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" podStartSLOduration=16.202827299 podStartE2EDuration="16.202827299s" podCreationTimestamp="2026-03-12 00:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:12:30.201651544 +0000 UTC m=+240.805067874" watchObservedRunningTime="2026-03-12 00:12:30.202827299 +0000 UTC m=+240.806243609" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.223593 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"] Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.226479 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c797dd6d5-m8pj6"] Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.234409 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"] Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.237311 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fc7887f-x5bsr"] Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.523993 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 00:12:30 crc kubenswrapper[4870]: E0312 00:12:30.524655 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6a8bb4-df10-46c3-91e6-826e501be09f" containerName="image-pruner" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 
00:12:30.524673 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6a8bb4-df10-46c3-91e6-826e501be09f" containerName="image-pruner" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.524817 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6a8bb4-df10-46c3-91e6-826e501be09f" containerName="image-pruner" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.525332 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.526976 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.531235 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.532001 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.681160 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd66a609-bcf2-4550-8a44-ff14e405f39a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.681227 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd66a609-bcf2-4550-8a44-ff14e405f39a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.782308 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd66a609-bcf2-4550-8a44-ff14e405f39a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.782416 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd66a609-bcf2-4550-8a44-ff14e405f39a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.782537 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd66a609-bcf2-4550-8a44-ff14e405f39a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.796889 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-30 01:32:47.982325981 +0000 UTC Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.796927 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6313h20m17.185400801s for next certificate rotation Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.801250 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd66a609-bcf2-4550-8a44-ff14e405f39a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:30 crc kubenswrapper[4870]: I0312 00:12:30.847323 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.108265 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64trx" event={"ID":"8f61f1af-812a-427f-a392-eec361571de3","Type":"ContainerStarted","Data":"a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586"} Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.114386 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt622" event={"ID":"5c8b915a-17ad-4b09-812f-dea6471a117c","Type":"ContainerStarted","Data":"e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3"} Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.128491 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.136653 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64trx" podStartSLOduration=3.453159404 podStartE2EDuration="34.136625797s" podCreationTimestamp="2026-03-12 00:11:57 +0000 UTC" firstStartedPulling="2026-03-12 00:11:59.823052184 +0000 UTC m=+210.426468494" lastFinishedPulling="2026-03-12 00:12:30.506518577 +0000 UTC m=+241.109934887" observedRunningTime="2026-03-12 00:12:31.132275128 +0000 UTC m=+241.735691448" watchObservedRunningTime="2026-03-12 00:12:31.136625797 +0000 UTC m=+241.740042117" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.248401 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qt622" podStartSLOduration=2.457606172 podStartE2EDuration="34.248385275s" podCreationTimestamp="2026-03-12 00:11:57 +0000 UTC" firstStartedPulling="2026-03-12 00:11:58.792570365 +0000 UTC m=+209.395986665" lastFinishedPulling="2026-03-12 
00:12:30.583349458 +0000 UTC m=+241.186765768" observedRunningTime="2026-03-12 00:12:31.17782717 +0000 UTC m=+241.781243500" watchObservedRunningTime="2026-03-12 00:12:31.248385275 +0000 UTC m=+241.851801585" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.251566 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.383322 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554570-l4btp" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.419279 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d57db598d-b8rds"] Mar 12 00:12:31 crc kubenswrapper[4870]: E0312 00:12:31.421344 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf754ba1-52f1-478d-9b07-1d83e55d3020" containerName="oc" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.421367 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf754ba1-52f1-478d-9b07-1d83e55d3020" containerName="oc" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.421472 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf754ba1-52f1-478d-9b07-1d83e55d3020" containerName="oc" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.421825 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.432456 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.432937 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.433550 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.433746 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.436047 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d57db598d-b8rds"] Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.436526 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.436695 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.437942 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.471694 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554572-7fms9" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.490037 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ml8m\" (UniqueName: \"kubernetes.io/projected/cf754ba1-52f1-478d-9b07-1d83e55d3020-kube-api-access-4ml8m\") pod \"cf754ba1-52f1-478d-9b07-1d83e55d3020\" (UID: \"cf754ba1-52f1-478d-9b07-1d83e55d3020\") " Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.500701 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf754ba1-52f1-478d-9b07-1d83e55d3020-kube-api-access-4ml8m" (OuterVolumeSpecName: "kube-api-access-4ml8m") pod "cf754ba1-52f1-478d-9b07-1d83e55d3020" (UID: "cf754ba1-52f1-478d-9b07-1d83e55d3020"). InnerVolumeSpecName "kube-api-access-4ml8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.591201 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lvsp\" (UniqueName: \"kubernetes.io/projected/ec36d635-25f6-4396-9218-6b5aa2c6809b-kube-api-access-4lvsp\") pod \"ec36d635-25f6-4396-9218-6b5aa2c6809b\" (UID: \"ec36d635-25f6-4396-9218-6b5aa2c6809b\") " Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.591482 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357861b1-111e-4044-9e93-5587d27ecc44-serving-cert\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.591526 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-proxy-ca-bundles\") pod 
\"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.591549 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-config\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.591599 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9jcb\" (UniqueName: \"kubernetes.io/projected/357861b1-111e-4044-9e93-5587d27ecc44-kube-api-access-m9jcb\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.591628 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-client-ca\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.591659 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ml8m\" (UniqueName: \"kubernetes.io/projected/cf754ba1-52f1-478d-9b07-1d83e55d3020-kube-api-access-4ml8m\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.595984 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec36d635-25f6-4396-9218-6b5aa2c6809b-kube-api-access-4lvsp" 
(OuterVolumeSpecName: "kube-api-access-4lvsp") pod "ec36d635-25f6-4396-9218-6b5aa2c6809b" (UID: "ec36d635-25f6-4396-9218-6b5aa2c6809b"). InnerVolumeSpecName "kube-api-access-4lvsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.692691 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9jcb\" (UniqueName: \"kubernetes.io/projected/357861b1-111e-4044-9e93-5587d27ecc44-kube-api-access-m9jcb\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.692765 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-client-ca\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.692821 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357861b1-111e-4044-9e93-5587d27ecc44-serving-cert\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.692849 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-proxy-ca-bundles\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.692889 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-config\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.692957 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lvsp\" (UniqueName: \"kubernetes.io/projected/ec36d635-25f6-4396-9218-6b5aa2c6809b-kube-api-access-4lvsp\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.695176 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-proxy-ca-bundles\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.695500 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-config\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.696107 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-client-ca\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.698238 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/357861b1-111e-4044-9e93-5587d27ecc44-serving-cert\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.710447 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9jcb\" (UniqueName: \"kubernetes.io/projected/357861b1-111e-4044-9e93-5587d27ecc44-kube-api-access-m9jcb\") pod \"controller-manager-6d57db598d-b8rds\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") " pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" Mar 12 00:12:31 crc kubenswrapper[4870]: E0312 00:12:31.715207 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6a8bb4_df10_46c3_91e6_826e501be09f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice/crio-df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b0ce3c_f432_48f4_81ed_62cf96995f8d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice\": RecentStats: unable to find data in memory cache]" Mar 12 00:12:31 crc kubenswrapper[4870]: I0312 00:12:31.760323 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds"
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.032367 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d57db598d-b8rds"]
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.111597 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b0ce3c-f432-48f4-81ed-62cf96995f8d" path="/var/lib/kubelet/pods/19b0ce3c-f432-48f4-81ed-62cf96995f8d/volumes"
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.112278 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53febc79-03f6-4672-889c-818fa0b8d11d" path="/var/lib/kubelet/pods/53febc79-03f6-4672-889c-818fa0b8d11d/volumes"
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.120334 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554570-l4btp" event={"ID":"cf754ba1-52f1-478d-9b07-1d83e55d3020","Type":"ContainerDied","Data":"c55289b04b7c5d5e73e1b81f6fae47166153af7e59a8e530f3117b3efb0b2891"}
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.120374 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c55289b04b7c5d5e73e1b81f6fae47166153af7e59a8e530f3117b3efb0b2891"
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.120438 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554570-l4btp"
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.124516 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554572-7fms9" event={"ID":"ec36d635-25f6-4396-9218-6b5aa2c6809b","Type":"ContainerDied","Data":"f141fc3a69840996a09ec8951037c65bf39aa35fb53ccdcf34b41073db14d5ee"}
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.124563 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554572-7fms9"
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.124575 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f141fc3a69840996a09ec8951037c65bf39aa35fb53ccdcf34b41073db14d5ee"
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.126051 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bd66a609-bcf2-4550-8a44-ff14e405f39a","Type":"ContainerStarted","Data":"088cda8d7d61379401f3bce3a24060d703e78c4a9f6c17453986450a6d026623"}
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.126083 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bd66a609-bcf2-4550-8a44-ff14e405f39a","Type":"ContainerStarted","Data":"c94b8b4d318b749465a46e876468080867b4dffed986d735228110fabab740fd"}
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.127461 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" event={"ID":"357861b1-111e-4044-9e93-5587d27ecc44","Type":"ContainerStarted","Data":"0cf05383512c64fa4f46c4947d2025cf528fc08fa246ad1faae84f41e804944e"}
Mar 12 00:12:32 crc kubenswrapper[4870]: I0312 00:12:32.145281 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.145261537 podStartE2EDuration="2.145261537s" podCreationTimestamp="2026-03-12 00:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:12:32.143593327 +0000 UTC m=+242.747009637" watchObservedRunningTime="2026-03-12 00:12:32.145261537 +0000 UTC m=+242.748677847"
Mar 12 00:12:33 crc kubenswrapper[4870]: I0312 00:12:33.135907 4870 generic.go:334] "Generic (PLEG): container finished" podID="bd66a609-bcf2-4550-8a44-ff14e405f39a" containerID="088cda8d7d61379401f3bce3a24060d703e78c4a9f6c17453986450a6d026623" exitCode=0
Mar 12 00:12:33 crc kubenswrapper[4870]: I0312 00:12:33.136084 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bd66a609-bcf2-4550-8a44-ff14e405f39a","Type":"ContainerDied","Data":"088cda8d7d61379401f3bce3a24060d703e78c4a9f6c17453986450a6d026623"}
Mar 12 00:12:33 crc kubenswrapper[4870]: I0312 00:12:33.139042 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" event={"ID":"357861b1-111e-4044-9e93-5587d27ecc44","Type":"ContainerStarted","Data":"ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf"}
Mar 12 00:12:33 crc kubenswrapper[4870]: I0312 00:12:33.139343 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds"
Mar 12 00:12:33 crc kubenswrapper[4870]: I0312 00:12:33.148411 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds"
Mar 12 00:12:33 crc kubenswrapper[4870]: I0312 00:12:33.174925 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" podStartSLOduration=19.1749044 podStartE2EDuration="19.1749044s" podCreationTimestamp="2026-03-12 00:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:12:33.173246891 +0000 UTC m=+243.776663201" watchObservedRunningTime="2026-03-12 00:12:33.1749044 +0000 UTC m=+243.778320710"
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.278165 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d57db598d-b8rds"]
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.378212 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"]
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.378462 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" podUID="9c551080-fcc6-4509-95b6-3a9f8cbcabf3" containerName="route-controller-manager" containerID="cri-o://e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970" gracePeriod=30
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.517350 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.636715 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd66a609-bcf2-4550-8a44-ff14e405f39a-kube-api-access\") pod \"bd66a609-bcf2-4550-8a44-ff14e405f39a\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") "
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.636915 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd66a609-bcf2-4550-8a44-ff14e405f39a-kubelet-dir\") pod \"bd66a609-bcf2-4550-8a44-ff14e405f39a\" (UID: \"bd66a609-bcf2-4550-8a44-ff14e405f39a\") "
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.636987 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd66a609-bcf2-4550-8a44-ff14e405f39a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd66a609-bcf2-4550-8a44-ff14e405f39a" (UID: "bd66a609-bcf2-4550-8a44-ff14e405f39a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.637444 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd66a609-bcf2-4550-8a44-ff14e405f39a-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.646020 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd66a609-bcf2-4550-8a44-ff14e405f39a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd66a609-bcf2-4550-8a44-ff14e405f39a" (UID: "bd66a609-bcf2-4550-8a44-ff14e405f39a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.738696 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd66a609-bcf2-4550-8a44-ff14e405f39a-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.765583 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.941391 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-serving-cert\") pod \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") "
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.941459 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv88r\" (UniqueName: \"kubernetes.io/projected/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-kube-api-access-mv88r\") pod \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") "
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.941487 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-client-ca\") pod \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") "
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.941552 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-config\") pod \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\" (UID: \"9c551080-fcc6-4509-95b6-3a9f8cbcabf3\") "
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.942682 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-config" (OuterVolumeSpecName: "config") pod "9c551080-fcc6-4509-95b6-3a9f8cbcabf3" (UID: "9c551080-fcc6-4509-95b6-3a9f8cbcabf3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.942894 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c551080-fcc6-4509-95b6-3a9f8cbcabf3" (UID: "9c551080-fcc6-4509-95b6-3a9f8cbcabf3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.946340 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c551080-fcc6-4509-95b6-3a9f8cbcabf3" (UID: "9c551080-fcc6-4509-95b6-3a9f8cbcabf3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:12:34 crc kubenswrapper[4870]: I0312 00:12:34.946498 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-kube-api-access-mv88r" (OuterVolumeSpecName: "kube-api-access-mv88r") pod "9c551080-fcc6-4509-95b6-3a9f8cbcabf3" (UID: "9c551080-fcc6-4509-95b6-3a9f8cbcabf3"). InnerVolumeSpecName "kube-api-access-mv88r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.043345 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.043380 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv88r\" (UniqueName: \"kubernetes.io/projected/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-kube-api-access-mv88r\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.043389 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.043398 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c551080-fcc6-4509-95b6-3a9f8cbcabf3-config\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.049462 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7gfg9"]
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.150741 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bd66a609-bcf2-4550-8a44-ff14e405f39a","Type":"ContainerDied","Data":"c94b8b4d318b749465a46e876468080867b4dffed986d735228110fabab740fd"}
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.150764 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.150782 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c94b8b4d318b749465a46e876468080867b4dffed986d735228110fabab740fd"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.151925 4870 generic.go:334] "Generic (PLEG): container finished" podID="9c551080-fcc6-4509-95b6-3a9f8cbcabf3" containerID="e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970" exitCode=0
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.151959 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" event={"ID":"9c551080-fcc6-4509-95b6-3a9f8cbcabf3","Type":"ContainerDied","Data":"e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970"}
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.151977 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.152005 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b" event={"ID":"9c551080-fcc6-4509-95b6-3a9f8cbcabf3","Type":"ContainerDied","Data":"84792f1a263bf7fabf62ff77ca996dd76639ca939ac71170c667cde22b5b3720"}
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.152024 4870 scope.go:117] "RemoveContainer" containerID="e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.165664 4870 scope.go:117] "RemoveContainer" containerID="e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970"
Mar 12 00:12:35 crc kubenswrapper[4870]: E0312 00:12:35.166051 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970\": container with ID starting with e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970 not found: ID does not exist" containerID="e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.166080 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970"} err="failed to get container status \"e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970\": rpc error: code = NotFound desc = could not find container \"e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970\": container with ID starting with e6180ee681256fef5990bc09b16b4ed22c3a8d778cb4dc0710c181dad38e4970 not found: ID does not exist"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.180286 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"]
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.185463 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-545dc5d7dc-zvv5b"]
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.418971 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"]
Mar 12 00:12:35 crc kubenswrapper[4870]: E0312 00:12:35.419214 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd66a609-bcf2-4550-8a44-ff14e405f39a" containerName="pruner"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.419229 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd66a609-bcf2-4550-8a44-ff14e405f39a" containerName="pruner"
Mar 12 00:12:35 crc kubenswrapper[4870]: E0312 00:12:35.419250 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c551080-fcc6-4509-95b6-3a9f8cbcabf3" containerName="route-controller-manager"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.419258 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c551080-fcc6-4509-95b6-3a9f8cbcabf3" containerName="route-controller-manager"
Mar 12 00:12:35 crc kubenswrapper[4870]: E0312 00:12:35.419269 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec36d635-25f6-4396-9218-6b5aa2c6809b" containerName="oc"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.419278 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec36d635-25f6-4396-9218-6b5aa2c6809b" containerName="oc"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.419384 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd66a609-bcf2-4550-8a44-ff14e405f39a" containerName="pruner"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.419402 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c551080-fcc6-4509-95b6-3a9f8cbcabf3" containerName="route-controller-manager"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.419412 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec36d635-25f6-4396-9218-6b5aa2c6809b" containerName="oc"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.419790 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.424172 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.424805 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.424886 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.426200 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.426917 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.427073 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.443506 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"]
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.549937 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb6f990-7335-4874-bea7-23ea9db79850-serving-cert\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.550246 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-client-ca\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.550278 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn786\" (UniqueName: \"kubernetes.io/projected/8cb6f990-7335-4874-bea7-23ea9db79850-kube-api-access-dn786\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.550323 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-config\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.576736 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.651946 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb6f990-7335-4874-bea7-23ea9db79850-serving-cert\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.652037 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-client-ca\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.652064 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn786\" (UniqueName: \"kubernetes.io/projected/8cb6f990-7335-4874-bea7-23ea9db79850-kube-api-access-dn786\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.652112 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-config\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.653482 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-config\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.653755 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-client-ca\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.662989 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb6f990-7335-4874-bea7-23ea9db79850-serving-cert\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.675770 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn786\" (UniqueName: \"kubernetes.io/projected/8cb6f990-7335-4874-bea7-23ea9db79850-kube-api-access-dn786\") pod \"route-controller-manager-6c77b9d598-q87td\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:35 crc kubenswrapper[4870]: I0312 00:12:35.742411 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.113991 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c551080-fcc6-4509-95b6-3a9f8cbcabf3" path="/var/lib/kubelet/pods/9c551080-fcc6-4509-95b6-3a9f8cbcabf3/volumes"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.160655 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"]
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.166100 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" podUID="357861b1-111e-4044-9e93-5587d27ecc44" containerName="controller-manager" containerID="cri-o://ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf" gracePeriod=30
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.512690 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.513790 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.568998 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.569334 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.572021 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.672138 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-var-lock\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.672326 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kube-api-access\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.672391 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.731166 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.775485 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-var-lock\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.775537 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kube-api-access\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.775558 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.775655 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.775688 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-var-lock\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.793289 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kube-api-access\") pod \"installer-9-crc\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.876490 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357861b1-111e-4044-9e93-5587d27ecc44-serving-cert\") pod \"357861b1-111e-4044-9e93-5587d27ecc44\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") "
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.876569 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9jcb\" (UniqueName: \"kubernetes.io/projected/357861b1-111e-4044-9e93-5587d27ecc44-kube-api-access-m9jcb\") pod \"357861b1-111e-4044-9e93-5587d27ecc44\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") "
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.876605 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-client-ca\") pod \"357861b1-111e-4044-9e93-5587d27ecc44\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") "
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.876656 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-config\") pod \"357861b1-111e-4044-9e93-5587d27ecc44\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") "
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.876746 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-proxy-ca-bundles\") pod \"357861b1-111e-4044-9e93-5587d27ecc44\" (UID: \"357861b1-111e-4044-9e93-5587d27ecc44\") "
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.877547 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-client-ca" (OuterVolumeSpecName: "client-ca") pod "357861b1-111e-4044-9e93-5587d27ecc44" (UID: "357861b1-111e-4044-9e93-5587d27ecc44"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.877608 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-config" (OuterVolumeSpecName: "config") pod "357861b1-111e-4044-9e93-5587d27ecc44" (UID: "357861b1-111e-4044-9e93-5587d27ecc44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.878116 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "357861b1-111e-4044-9e93-5587d27ecc44" (UID: "357861b1-111e-4044-9e93-5587d27ecc44"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.880419 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357861b1-111e-4044-9e93-5587d27ecc44-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "357861b1-111e-4044-9e93-5587d27ecc44" (UID: "357861b1-111e-4044-9e93-5587d27ecc44"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.880430 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357861b1-111e-4044-9e93-5587d27ecc44-kube-api-access-m9jcb" (OuterVolumeSpecName: "kube-api-access-m9jcb") pod "357861b1-111e-4044-9e93-5587d27ecc44" (UID: "357861b1-111e-4044-9e93-5587d27ecc44"). InnerVolumeSpecName "kube-api-access-m9jcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.886427 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.980269 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-config\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.980297 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.980309 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357861b1-111e-4044-9e93-5587d27ecc44-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.980318 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9jcb\" (UniqueName: \"kubernetes.io/projected/357861b1-111e-4044-9e93-5587d27ecc44-kube-api-access-m9jcb\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:36 crc kubenswrapper[4870]: I0312 00:12:36.980326 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/357861b1-111e-4044-9e93-5587d27ecc44-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.094474 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.172360 4870 generic.go:334] "Generic (PLEG): container finished" podID="357861b1-111e-4044-9e93-5587d27ecc44" containerID="ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf" exitCode=0
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.172477 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds"
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.172489 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" event={"ID":"357861b1-111e-4044-9e93-5587d27ecc44","Type":"ContainerDied","Data":"ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf"}
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.172531 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d57db598d-b8rds" event={"ID":"357861b1-111e-4044-9e93-5587d27ecc44","Type":"ContainerDied","Data":"0cf05383512c64fa4f46c4947d2025cf528fc08fa246ad1faae84f41e804944e"}
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.172549 4870 scope.go:117] "RemoveContainer" containerID="ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf"
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.174884 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" event={"ID":"8cb6f990-7335-4874-bea7-23ea9db79850","Type":"ContainerStarted","Data":"2c9d20a581ae7ac4e891144f3266b18621114d256719b155a039a0888e99e0bd"}
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.174917 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" event={"ID":"8cb6f990-7335-4874-bea7-23ea9db79850","Type":"ContainerStarted","Data":"b212de18dc6a0469334541296a92d7adeaf902334df0d0a9fd87029d0cd60680"}
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.176108 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.180643 4870 generic.go:334] "Generic (PLEG): container finished" podID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerID="e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1" exitCode=0
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.180702 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qfz5g" event={"ID":"633cb50d-ccf5-4e3c-a40f-05581c94950e","Type":"ContainerDied","Data":"e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1"}
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.181545 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.182832 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29fd0efd-a422-4b2d-b22c-f2bda94a368d","Type":"ContainerStarted","Data":"777a189827dd205699559065ca791e65a0fce965345393a9ac81c1684dbd9269"}
Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.194327 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" podStartSLOduration=3.19430827 podStartE2EDuration="3.19430827s" podCreationTimestamp="2026-03-12 00:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000
UTC" observedRunningTime="2026-03-12 00:12:37.189464366 +0000 UTC m=+247.792880676" watchObservedRunningTime="2026-03-12 00:12:37.19430827 +0000 UTC m=+247.797724570" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.243572 4870 scope.go:117] "RemoveContainer" containerID="ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf" Mar 12 00:12:37 crc kubenswrapper[4870]: E0312 00:12:37.245007 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf\": container with ID starting with ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf not found: ID does not exist" containerID="ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.245044 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf"} err="failed to get container status \"ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf\": rpc error: code = NotFound desc = could not find container \"ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf\": container with ID starting with ae8a13bd322eeba12d54f82e5e8a028eab9cc60e198064951c2491b61b34e7cf not found: ID does not exist" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.248779 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d57db598d-b8rds"] Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.251362 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d57db598d-b8rds"] Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.421964 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78f8f679bb-bpr7s"] Mar 12 00:12:37 crc 
kubenswrapper[4870]: E0312 00:12:37.422271 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357861b1-111e-4044-9e93-5587d27ecc44" containerName="controller-manager" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.422294 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="357861b1-111e-4044-9e93-5587d27ecc44" containerName="controller-manager" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.422435 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="357861b1-111e-4044-9e93-5587d27ecc44" containerName="controller-manager" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.422937 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.424687 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.424814 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.424867 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.425103 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.434490 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.434548 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.439846 4870 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78f8f679bb-bpr7s"] Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.442024 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.588009 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-config\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.588379 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-client-ca\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.588583 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dmw\" (UniqueName: \"kubernetes.io/projected/4e77934e-1d7b-432c-81a4-a9ede986a0d2-kube-api-access-t6dmw\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.588645 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e77934e-1d7b-432c-81a4-a9ede986a0d2-serving-cert\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " 
pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.588757 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-proxy-ca-bundles\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.690507 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-config\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.690554 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-client-ca\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.690594 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dmw\" (UniqueName: \"kubernetes.io/projected/4e77934e-1d7b-432c-81a4-a9ede986a0d2-kube-api-access-t6dmw\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.690617 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4e77934e-1d7b-432c-81a4-a9ede986a0d2-serving-cert\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.690641 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-proxy-ca-bundles\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.691818 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-client-ca\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.691964 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-proxy-ca-bundles\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.692765 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-config\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.699653 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e77934e-1d7b-432c-81a4-a9ede986a0d2-serving-cert\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.708546 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dmw\" (UniqueName: \"kubernetes.io/projected/4e77934e-1d7b-432c-81a4-a9ede986a0d2-kube-api-access-t6dmw\") pod \"controller-manager-78f8f679bb-bpr7s\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.750875 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.779555 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.779598 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.940622 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:12:37 crc kubenswrapper[4870]: I0312 00:12:37.992080 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78f8f679bb-bpr7s"] Mar 12 00:12:37 crc kubenswrapper[4870]: W0312 00:12:37.995653 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e77934e_1d7b_432c_81a4_a9ede986a0d2.slice/crio-4ab635eb8a9a4408d3bec16740c9d1bb4a97613137ca69a8439678e6a7837431 WatchSource:0}: Error finding container 4ab635eb8a9a4408d3bec16740c9d1bb4a97613137ca69a8439678e6a7837431: Status 404 returned error can't find the container with id 4ab635eb8a9a4408d3bec16740c9d1bb4a97613137ca69a8439678e6a7837431 Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.112572 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357861b1-111e-4044-9e93-5587d27ecc44" path="/var/lib/kubelet/pods/357861b1-111e-4044-9e93-5587d27ecc44/volumes" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.157039 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64trx" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.157199 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64trx" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.196175 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29fd0efd-a422-4b2d-b22c-f2bda94a368d","Type":"ContainerStarted","Data":"c2518116aabefc160e23c5e45118bc636c9496332ec02bc174106da71fa920d3"} Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.199201 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qfz5g" event={"ID":"633cb50d-ccf5-4e3c-a40f-05581c94950e","Type":"ContainerStarted","Data":"8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab"} Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.202460 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" 
event={"ID":"4e77934e-1d7b-432c-81a4-a9ede986a0d2","Type":"ContainerStarted","Data":"05317845e35ea424a4d136d08d27cb900b9c7292538c28b35654c6215786b4d4"} Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.202498 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" event={"ID":"4e77934e-1d7b-432c-81a4-a9ede986a0d2","Type":"ContainerStarted","Data":"4ab635eb8a9a4408d3bec16740c9d1bb4a97613137ca69a8439678e6a7837431"} Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.202515 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.204026 4870 patch_prober.go:28] interesting pod/controller-manager-78f8f679bb-bpr7s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.204058 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" podUID="4e77934e-1d7b-432c-81a4-a9ede986a0d2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.212242 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.212224194 podStartE2EDuration="2.212224194s" podCreationTimestamp="2026-03-12 00:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:12:38.211517253 +0000 UTC m=+248.814933563" watchObservedRunningTime="2026-03-12 00:12:38.212224194 
+0000 UTC m=+248.815640494" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.213266 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64trx" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.235900 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qfz5g" podStartSLOduration=3.066552588 podStartE2EDuration="43.235881087s" podCreationTimestamp="2026-03-12 00:11:55 +0000 UTC" firstStartedPulling="2026-03-12 00:11:57.454966668 +0000 UTC m=+208.058382978" lastFinishedPulling="2026-03-12 00:12:37.624295167 +0000 UTC m=+248.227711477" observedRunningTime="2026-03-12 00:12:38.232211628 +0000 UTC m=+248.835627938" watchObservedRunningTime="2026-03-12 00:12:38.235881087 +0000 UTC m=+248.839297397" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.247009 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:12:38 crc kubenswrapper[4870]: I0312 00:12:38.254611 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" podStartSLOduration=4.254595403 podStartE2EDuration="4.254595403s" podCreationTimestamp="2026-03-12 00:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:12:38.25115514 +0000 UTC m=+248.854571450" watchObservedRunningTime="2026-03-12 00:12:38.254595403 +0000 UTC m=+248.858011713" Mar 12 00:12:39 crc kubenswrapper[4870]: I0312 00:12:39.213321 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:39 crc kubenswrapper[4870]: I0312 00:12:39.256943 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-64trx" Mar 12 00:12:40 crc kubenswrapper[4870]: I0312 00:12:40.637219 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64trx"] Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.223712 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hh9h" event={"ID":"3ba3252e-f349-49ce-87d9-64172121150c","Type":"ContainerStarted","Data":"34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c"} Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.223810 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64trx" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="registry-server" containerID="cri-o://a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586" gracePeriod=2 Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.682966 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64trx" Mar 12 00:12:41 crc kubenswrapper[4870]: E0312 00:12:41.844705 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice/crio-df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6a8bb4_df10_46c3_91e6_826e501be09f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b0ce3c_f432_48f4_81ed_62cf96995f8d.slice\": RecentStats: unable to find data in memory cache]" Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.864361 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-utilities\") pod \"8f61f1af-812a-427f-a392-eec361571de3\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.864747 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl2gp\" (UniqueName: \"kubernetes.io/projected/8f61f1af-812a-427f-a392-eec361571de3-kube-api-access-wl2gp\") pod \"8f61f1af-812a-427f-a392-eec361571de3\" (UID: \"8f61f1af-812a-427f-a392-eec361571de3\") " Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.864793 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-catalog-content\") pod \"8f61f1af-812a-427f-a392-eec361571de3\" (UID: 
\"8f61f1af-812a-427f-a392-eec361571de3\") " Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.866085 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-utilities" (OuterVolumeSpecName: "utilities") pod "8f61f1af-812a-427f-a392-eec361571de3" (UID: "8f61f1af-812a-427f-a392-eec361571de3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.873404 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f61f1af-812a-427f-a392-eec361571de3-kube-api-access-wl2gp" (OuterVolumeSpecName: "kube-api-access-wl2gp") pod "8f61f1af-812a-427f-a392-eec361571de3" (UID: "8f61f1af-812a-427f-a392-eec361571de3"). InnerVolumeSpecName "kube-api-access-wl2gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.917240 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f61f1af-812a-427f-a392-eec361571de3" (UID: "8f61f1af-812a-427f-a392-eec361571de3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.965756 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.965802 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl2gp\" (UniqueName: \"kubernetes.io/projected/8f61f1af-812a-427f-a392-eec361571de3-kube-api-access-wl2gp\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:41 crc kubenswrapper[4870]: I0312 00:12:41.965818 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f61f1af-812a-427f-a392-eec361571de3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.231110 4870 generic.go:334] "Generic (PLEG): container finished" podID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerID="c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc" exitCode=0 Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.231190 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5gt9r" event={"ID":"985b1034-4300-4cdf-a09a-33d70a0ea7b0","Type":"ContainerDied","Data":"c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc"} Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.233983 4870 generic.go:334] "Generic (PLEG): container finished" podID="8f61f1af-812a-427f-a392-eec361571de3" containerID="a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586" exitCode=0 Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.234059 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64trx" 
event={"ID":"8f61f1af-812a-427f-a392-eec361571de3","Type":"ContainerDied","Data":"a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586"} Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.234083 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64trx" event={"ID":"8f61f1af-812a-427f-a392-eec361571de3","Type":"ContainerDied","Data":"3cbd7056487aadf2d3f14496867dbe69cf35d0a88b5b685b0f35adc3eab4eb40"} Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.234103 4870 scope.go:117] "RemoveContainer" containerID="a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.234129 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64trx" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.238767 4870 generic.go:334] "Generic (PLEG): container finished" podID="3ba3252e-f349-49ce-87d9-64172121150c" containerID="34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c" exitCode=0 Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.238799 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hh9h" event={"ID":"3ba3252e-f349-49ce-87d9-64172121150c","Type":"ContainerDied","Data":"34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c"} Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.257785 4870 scope.go:117] "RemoveContainer" containerID="37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.277231 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64trx"] Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.280761 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64trx"] Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 
00:12:42.284695 4870 scope.go:117] "RemoveContainer" containerID="642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.298944 4870 scope.go:117] "RemoveContainer" containerID="a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586" Mar 12 00:12:42 crc kubenswrapper[4870]: E0312 00:12:42.299410 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586\": container with ID starting with a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586 not found: ID does not exist" containerID="a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.299463 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586"} err="failed to get container status \"a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586\": rpc error: code = NotFound desc = could not find container \"a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586\": container with ID starting with a1be992088bde34d6b76b44b6198ec75203c83fe07423196c0f129d3c4c25586 not found: ID does not exist" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.299497 4870 scope.go:117] "RemoveContainer" containerID="37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001" Mar 12 00:12:42 crc kubenswrapper[4870]: E0312 00:12:42.299975 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001\": container with ID starting with 37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001 not found: ID does not exist" 
containerID="37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.300022 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001"} err="failed to get container status \"37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001\": rpc error: code = NotFound desc = could not find container \"37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001\": container with ID starting with 37d5c22ca27b95a3197a17beacb4f580849dd45722f91dae5001418c82728001 not found: ID does not exist" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.300049 4870 scope.go:117] "RemoveContainer" containerID="642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae" Mar 12 00:12:42 crc kubenswrapper[4870]: E0312 00:12:42.300570 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae\": container with ID starting with 642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae not found: ID does not exist" containerID="642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae" Mar 12 00:12:42 crc kubenswrapper[4870]: I0312 00:12:42.300592 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae"} err="failed to get container status \"642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae\": rpc error: code = NotFound desc = could not find container \"642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae\": container with ID starting with 642e3d29759b3848d4d8f2c461f7d3fa3dcba3a3e58f5466540d808e90a8b4ae not found: ID does not exist" Mar 12 00:12:43 crc kubenswrapper[4870]: I0312 00:12:43.249653 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hh9h" event={"ID":"3ba3252e-f349-49ce-87d9-64172121150c","Type":"ContainerStarted","Data":"e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727"} Mar 12 00:12:43 crc kubenswrapper[4870]: I0312 00:12:43.272430 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8hh9h" podStartSLOduration=2.912160513 podStartE2EDuration="48.272406868s" podCreationTimestamp="2026-03-12 00:11:55 +0000 UTC" firstStartedPulling="2026-03-12 00:11:57.61163829 +0000 UTC m=+208.215054600" lastFinishedPulling="2026-03-12 00:12:42.971884645 +0000 UTC m=+253.575300955" observedRunningTime="2026-03-12 00:12:43.26742282 +0000 UTC m=+253.870839130" watchObservedRunningTime="2026-03-12 00:12:43.272406868 +0000 UTC m=+253.875823178" Mar 12 00:12:44 crc kubenswrapper[4870]: I0312 00:12:44.115259 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f61f1af-812a-427f-a392-eec361571de3" path="/var/lib/kubelet/pods/8f61f1af-812a-427f-a392-eec361571de3/volumes" Mar 12 00:12:44 crc kubenswrapper[4870]: I0312 00:12:44.377070 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5gt9r" event={"ID":"985b1034-4300-4cdf-a09a-33d70a0ea7b0","Type":"ContainerStarted","Data":"c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4"} Mar 12 00:12:44 crc kubenswrapper[4870]: I0312 00:12:44.381201 4870 generic.go:334] "Generic (PLEG): container finished" podID="61a02593-b52d-470c-967d-565b6fafde45" containerID="8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce" exitCode=0 Mar 12 00:12:44 crc kubenswrapper[4870]: I0312 00:12:44.381263 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m78hv" 
event={"ID":"61a02593-b52d-470c-967d-565b6fafde45","Type":"ContainerDied","Data":"8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce"} Mar 12 00:12:44 crc kubenswrapper[4870]: I0312 00:12:44.399936 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5gt9r" podStartSLOduration=3.773563031 podStartE2EDuration="49.399919507s" podCreationTimestamp="2026-03-12 00:11:55 +0000 UTC" firstStartedPulling="2026-03-12 00:11:57.537273682 +0000 UTC m=+208.140689992" lastFinishedPulling="2026-03-12 00:12:43.163630158 +0000 UTC m=+253.767046468" observedRunningTime="2026-03-12 00:12:44.395597659 +0000 UTC m=+254.999013969" watchObservedRunningTime="2026-03-12 00:12:44.399919507 +0000 UTC m=+255.003335817" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.411603 4870 generic.go:334] "Generic (PLEG): container finished" podID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerID="5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003" exitCode=0 Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.411695 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5h2s" event={"ID":"5a32152f-50ce-4712-8ea4-dc6b72dc6f08","Type":"ContainerDied","Data":"5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003"} Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.414612 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m78hv" event={"ID":"61a02593-b52d-470c-967d-565b6fafde45","Type":"ContainerStarted","Data":"2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2"} Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.466164 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m78hv" podStartSLOduration=2.894100457 podStartE2EDuration="50.466114716s" podCreationTimestamp="2026-03-12 00:11:55 +0000 UTC" 
firstStartedPulling="2026-03-12 00:11:57.463020367 +0000 UTC m=+208.066436677" lastFinishedPulling="2026-03-12 00:12:45.035034616 +0000 UTC m=+255.638450936" observedRunningTime="2026-03-12 00:12:45.459651454 +0000 UTC m=+256.063067764" watchObservedRunningTime="2026-03-12 00:12:45.466114716 +0000 UTC m=+256.069531026" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.596278 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.596337 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.790278 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.790370 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.843082 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.959853 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:12:45 crc kubenswrapper[4870]: I0312 00:12:45.960874 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.013158 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.192787 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.192835 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.238210 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.423249 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5h2s" event={"ID":"5a32152f-50ce-4712-8ea4-dc6b72dc6f08","Type":"ContainerStarted","Data":"1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300"} Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.425892 4870 generic.go:334] "Generic (PLEG): container finished" podID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerID="5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8" exitCode=0 Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.426512 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-78vzj" event={"ID":"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1","Type":"ContainerDied","Data":"5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8"} Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.446216 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r5h2s" podStartSLOduration=3.320795197 podStartE2EDuration="48.446194908s" podCreationTimestamp="2026-03-12 00:11:58 +0000 UTC" firstStartedPulling="2026-03-12 00:12:00.850717009 +0000 UTC m=+211.454133319" lastFinishedPulling="2026-03-12 00:12:45.97611672 +0000 UTC m=+256.579533030" observedRunningTime="2026-03-12 00:12:46.443942091 +0000 UTC m=+257.047358401" watchObservedRunningTime="2026-03-12 00:12:46.446194908 +0000 UTC m=+257.049611218" Mar 12 00:12:46 crc kubenswrapper[4870]: 
I0312 00:12:46.472988 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:12:46 crc kubenswrapper[4870]: I0312 00:12:46.647453 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-m78hv" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="registry-server" probeResult="failure" output=< Mar 12 00:12:46 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Mar 12 00:12:46 crc kubenswrapper[4870]: > Mar 12 00:12:47 crc kubenswrapper[4870]: I0312 00:12:47.436087 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-78vzj" event={"ID":"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1","Type":"ContainerStarted","Data":"458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3"} Mar 12 00:12:47 crc kubenswrapper[4870]: I0312 00:12:47.455928 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-78vzj" podStartSLOduration=3.399450303 podStartE2EDuration="49.45591049s" podCreationTimestamp="2026-03-12 00:11:58 +0000 UTC" firstStartedPulling="2026-03-12 00:12:00.854312626 +0000 UTC m=+211.457728936" lastFinishedPulling="2026-03-12 00:12:46.910772773 +0000 UTC m=+257.514189123" observedRunningTime="2026-03-12 00:12:47.453370725 +0000 UTC m=+258.056787055" watchObservedRunningTime="2026-03-12 00:12:47.45591049 +0000 UTC m=+258.059326800" Mar 12 00:12:47 crc kubenswrapper[4870]: I0312 00:12:47.594863 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:12:47 crc kubenswrapper[4870]: I0312 00:12:47.595310 4870 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:12:47 crc kubenswrapper[4870]: I0312 00:12:47.595442 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:12:47 crc kubenswrapper[4870]: I0312 00:12:47.596311 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff"} pod="openshift-machine-config-operator/machine-config-daemon-84dfr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 00:12:47 crc kubenswrapper[4870]: I0312 00:12:47.596442 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" containerID="cri-o://9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff" gracePeriod=600 Mar 12 00:12:48 crc kubenswrapper[4870]: I0312 00:12:48.441233 4870 generic.go:334] "Generic (PLEG): container finished" podID="988c0290-1e98-46c8-8253-a4718914b9ef" containerID="9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff" exitCode=0 Mar 12 00:12:48 crc kubenswrapper[4870]: I0312 00:12:48.441279 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerDied","Data":"9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff"} Mar 12 00:12:48 crc kubenswrapper[4870]: I0312 00:12:48.441303 4870 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"d8c3058facf9a2b52988623f4f9078fda5941f091e6fa03732a464860e4b1dac"} Mar 12 00:12:48 crc kubenswrapper[4870]: I0312 00:12:48.816793 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-78vzj" Mar 12 00:12:48 crc kubenswrapper[4870]: I0312 00:12:48.817247 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-78vzj" Mar 12 00:12:49 crc kubenswrapper[4870]: I0312 00:12:49.233912 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r5h2s" Mar 12 00:12:49 crc kubenswrapper[4870]: I0312 00:12:49.233978 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r5h2s" Mar 12 00:12:49 crc kubenswrapper[4870]: I0312 00:12:49.871920 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-78vzj" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="registry-server" probeResult="failure" output=< Mar 12 00:12:49 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Mar 12 00:12:49 crc kubenswrapper[4870]: > Mar 12 00:12:50 crc kubenswrapper[4870]: I0312 00:12:50.282822 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r5h2s" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="registry-server" probeResult="failure" output=< Mar 12 00:12:50 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Mar 12 00:12:50 crc kubenswrapper[4870]: > Mar 12 00:12:51 crc kubenswrapper[4870]: E0312 00:12:51.974317 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice/crio-df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b0ce3c_f432_48f4_81ed_62cf96995f8d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6a8bb4_df10_46c3_91e6_826e501be09f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice\": RecentStats: unable to find data in memory cache]" Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.289091 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78f8f679bb-bpr7s"] Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.289687 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" podUID="4e77934e-1d7b-432c-81a4-a9ede986a0d2" containerName="controller-manager" containerID="cri-o://05317845e35ea424a4d136d08d27cb900b9c7292538c28b35654c6215786b4d4" gracePeriod=30 Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.312546 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"] Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.312736 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" podUID="8cb6f990-7335-4874-bea7-23ea9db79850" containerName="route-controller-manager" containerID="cri-o://2c9d20a581ae7ac4e891144f3266b18621114d256719b155a039a0888e99e0bd" gracePeriod=30 Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.474914 4870 generic.go:334] 
"Generic (PLEG): container finished" podID="8cb6f990-7335-4874-bea7-23ea9db79850" containerID="2c9d20a581ae7ac4e891144f3266b18621114d256719b155a039a0888e99e0bd" exitCode=0 Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.475164 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" event={"ID":"8cb6f990-7335-4874-bea7-23ea9db79850","Type":"ContainerDied","Data":"2c9d20a581ae7ac4e891144f3266b18621114d256719b155a039a0888e99e0bd"} Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.476962 4870 generic.go:334] "Generic (PLEG): container finished" podID="4e77934e-1d7b-432c-81a4-a9ede986a0d2" containerID="05317845e35ea424a4d136d08d27cb900b9c7292538c28b35654c6215786b4d4" exitCode=0 Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.477003 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" event={"ID":"4e77934e-1d7b-432c-81a4-a9ede986a0d2","Type":"ContainerDied","Data":"05317845e35ea424a4d136d08d27cb900b9c7292538c28b35654c6215786b4d4"} Mar 12 00:12:54 crc kubenswrapper[4870]: I0312 00:12:54.914634 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.064426 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.101340 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn786\" (UniqueName: \"kubernetes.io/projected/8cb6f990-7335-4874-bea7-23ea9db79850-kube-api-access-dn786\") pod \"8cb6f990-7335-4874-bea7-23ea9db79850\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.101388 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb6f990-7335-4874-bea7-23ea9db79850-serving-cert\") pod \"8cb6f990-7335-4874-bea7-23ea9db79850\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.101417 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-proxy-ca-bundles\") pod \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.101443 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-client-ca\") pod \"8cb6f990-7335-4874-bea7-23ea9db79850\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.102274 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4e77934e-1d7b-432c-81a4-a9ede986a0d2" (UID: "4e77934e-1d7b-432c-81a4-a9ede986a0d2"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.102497 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-config" (OuterVolumeSpecName: "config") pod "8cb6f990-7335-4874-bea7-23ea9db79850" (UID: "8cb6f990-7335-4874-bea7-23ea9db79850"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.101477 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-config\") pod \"8cb6f990-7335-4874-bea7-23ea9db79850\" (UID: \"8cb6f990-7335-4874-bea7-23ea9db79850\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.102563 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-client-ca" (OuterVolumeSpecName: "client-ca") pod "8cb6f990-7335-4874-bea7-23ea9db79850" (UID: "8cb6f990-7335-4874-bea7-23ea9db79850"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.102816 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.102843 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.102854 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb6f990-7335-4874-bea7-23ea9db79850-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.110666 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb6f990-7335-4874-bea7-23ea9db79850-kube-api-access-dn786" (OuterVolumeSpecName: "kube-api-access-dn786") pod "8cb6f990-7335-4874-bea7-23ea9db79850" (UID: "8cb6f990-7335-4874-bea7-23ea9db79850"). InnerVolumeSpecName "kube-api-access-dn786". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.111833 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb6f990-7335-4874-bea7-23ea9db79850-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cb6f990-7335-4874-bea7-23ea9db79850" (UID: "8cb6f990-7335-4874-bea7-23ea9db79850"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.203284 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e77934e-1d7b-432c-81a4-a9ede986a0d2-serving-cert\") pod \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.203378 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6dmw\" (UniqueName: \"kubernetes.io/projected/4e77934e-1d7b-432c-81a4-a9ede986a0d2-kube-api-access-t6dmw\") pod \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.203401 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-client-ca\") pod \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.203434 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-config\") pod \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\" (UID: \"4e77934e-1d7b-432c-81a4-a9ede986a0d2\") " Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.203704 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn786\" (UniqueName: \"kubernetes.io/projected/8cb6f990-7335-4874-bea7-23ea9db79850-kube-api-access-dn786\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.203717 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb6f990-7335-4874-bea7-23ea9db79850-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 
00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.206300 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "4e77934e-1d7b-432c-81a4-a9ede986a0d2" (UID: "4e77934e-1d7b-432c-81a4-a9ede986a0d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.206958 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e77934e-1d7b-432c-81a4-a9ede986a0d2-kube-api-access-t6dmw" (OuterVolumeSpecName: "kube-api-access-t6dmw") pod "4e77934e-1d7b-432c-81a4-a9ede986a0d2" (UID: "4e77934e-1d7b-432c-81a4-a9ede986a0d2"). InnerVolumeSpecName "kube-api-access-t6dmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.208114 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-config" (OuterVolumeSpecName: "config") pod "4e77934e-1d7b-432c-81a4-a9ede986a0d2" (UID: "4e77934e-1d7b-432c-81a4-a9ede986a0d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.209881 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e77934e-1d7b-432c-81a4-a9ede986a0d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4e77934e-1d7b-432c-81a4-a9ede986a0d2" (UID: "4e77934e-1d7b-432c-81a4-a9ede986a0d2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.304965 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6dmw\" (UniqueName: \"kubernetes.io/projected/4e77934e-1d7b-432c-81a4-a9ede986a0d2-kube-api-access-t6dmw\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.306627 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.306783 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77934e-1d7b-432c-81a4-a9ede986a0d2-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.307959 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e77934e-1d7b-432c-81a4-a9ede986a0d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.441830 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp"] Mar 12 00:12:55 crc kubenswrapper[4870]: E0312 00:12:55.442352 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="extract-utilities" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.442383 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="extract-utilities" Mar 12 00:12:55 crc kubenswrapper[4870]: E0312 00:12:55.442417 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb6f990-7335-4874-bea7-23ea9db79850" containerName="route-controller-manager" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.442434 4870 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8cb6f990-7335-4874-bea7-23ea9db79850" containerName="route-controller-manager" Mar 12 00:12:55 crc kubenswrapper[4870]: E0312 00:12:55.442462 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="extract-content" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.442481 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="extract-content" Mar 12 00:12:55 crc kubenswrapper[4870]: E0312 00:12:55.442512 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="registry-server" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.442529 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="registry-server" Mar 12 00:12:55 crc kubenswrapper[4870]: E0312 00:12:55.442571 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e77934e-1d7b-432c-81a4-a9ede986a0d2" containerName="controller-manager" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.442588 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e77934e-1d7b-432c-81a4-a9ede986a0d2" containerName="controller-manager" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.443300 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb6f990-7335-4874-bea7-23ea9db79850" containerName="route-controller-manager" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.443355 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e77934e-1d7b-432c-81a4-a9ede986a0d2" containerName="controller-manager" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.443390 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f61f1af-812a-427f-a392-eec361571de3" containerName="registry-server" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.444387 
4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.447542 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r"] Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.450505 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.458002 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp"] Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.462904 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r"] Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.486058 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" event={"ID":"4e77934e-1d7b-432c-81a4-a9ede986a0d2","Type":"ContainerDied","Data":"4ab635eb8a9a4408d3bec16740c9d1bb4a97613137ca69a8439678e6a7837431"} Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.486431 4870 scope.go:117] "RemoveContainer" containerID="05317845e35ea424a4d136d08d27cb900b9c7292538c28b35654c6215786b4d4" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.486649 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78f8f679bb-bpr7s" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.488261 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" event={"ID":"8cb6f990-7335-4874-bea7-23ea9db79850","Type":"ContainerDied","Data":"b212de18dc6a0469334541296a92d7adeaf902334df0d0a9fd87029d0cd60680"} Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.488354 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.510866 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-proxy-ca-bundles\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.510927 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-client-ca\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.510959 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-config\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc 
kubenswrapper[4870]: I0312 00:12:55.510980 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-config\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.511005 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljt9w\" (UniqueName: \"kubernetes.io/projected/c21f0998-2c71-4243-91a2-7478cbfeea60-kube-api-access-ljt9w\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.511031 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21602ca-9c73-4523-bb58-e165561d8d43-serving-cert\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.511054 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21f0998-2c71-4243-91a2-7478cbfeea60-serving-cert\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.511077 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-client-ca\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.511104 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxfc2\" (UniqueName: \"kubernetes.io/projected/a21602ca-9c73-4523-bb58-e165561d8d43-kube-api-access-lxfc2\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.526222 4870 scope.go:117] "RemoveContainer" containerID="2c9d20a581ae7ac4e891144f3266b18621114d256719b155a039a0888e99e0bd" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.548951 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"] Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.561299 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c77b9d598-q87td"] Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.571266 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78f8f679bb-bpr7s"] Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.575286 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78f8f679bb-bpr7s"] Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.612984 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfc2\" (UniqueName: \"kubernetes.io/projected/a21602ca-9c73-4523-bb58-e165561d8d43-kube-api-access-lxfc2\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: 
\"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613063 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-proxy-ca-bundles\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613121 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-client-ca\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613163 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-config\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613189 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-config\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613216 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljt9w\" (UniqueName: 
\"kubernetes.io/projected/c21f0998-2c71-4243-91a2-7478cbfeea60-kube-api-access-ljt9w\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613242 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21602ca-9c73-4523-bb58-e165561d8d43-serving-cert\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613264 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21f0998-2c71-4243-91a2-7478cbfeea60-serving-cert\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.613285 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-client-ca\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.614238 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-client-ca\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc 
kubenswrapper[4870]: I0312 00:12:55.614254 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-client-ca\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.614336 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-proxy-ca-bundles\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.614821 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-config\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.616055 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-config\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.619311 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21f0998-2c71-4243-91a2-7478cbfeea60-serving-cert\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " 
pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.619417 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21602ca-9c73-4523-bb58-e165561d8d43-serving-cert\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.628787 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljt9w\" (UniqueName: \"kubernetes.io/projected/c21f0998-2c71-4243-91a2-7478cbfeea60-kube-api-access-ljt9w\") pod \"route-controller-manager-7c675c78cc-xdz9r\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.636814 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxfc2\" (UniqueName: \"kubernetes.io/projected/a21602ca-9c73-4523-bb58-e165561d8d43-kube-api-access-lxfc2\") pod \"controller-manager-5b94cb8b7-n5sdp\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.672921 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.741616 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.802896 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:55 crc kubenswrapper[4870]: I0312 00:12:55.825742 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 00:12:56.024662 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 00:12:56.111046 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e77934e-1d7b-432c-81a4-a9ede986a0d2" path="/var/lib/kubelet/pods/4e77934e-1d7b-432c-81a4-a9ede986a0d2/volumes" Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 00:12:56.111745 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb6f990-7335-4874-bea7-23ea9db79850" path="/var/lib/kubelet/pods/8cb6f990-7335-4874-bea7-23ea9db79850/volumes" Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 00:12:56.234603 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 00:12:56.251343 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r"] Mar 12 00:12:56 crc kubenswrapper[4870]: W0312 00:12:56.257634 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc21f0998_2c71_4243_91a2_7478cbfeea60.slice/crio-03aa1ca0b49576083a412b17d7a9e43b67aa13fbf9ca9eb41f66410bfb68f6f9 WatchSource:0}: Error finding container 03aa1ca0b49576083a412b17d7a9e43b67aa13fbf9ca9eb41f66410bfb68f6f9: Status 404 returned error can't find the container with id 03aa1ca0b49576083a412b17d7a9e43b67aa13fbf9ca9eb41f66410bfb68f6f9 Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 
00:12:56.302604 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp"] Mar 12 00:12:56 crc kubenswrapper[4870]: W0312 00:12:56.317329 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21602ca_9c73_4523_bb58_e165561d8d43.slice/crio-f281948d57ac78146bcebd5cd122bba0bb90bd38f7e055f22391a4a5a86b9ba2 WatchSource:0}: Error finding container f281948d57ac78146bcebd5cd122bba0bb90bd38f7e055f22391a4a5a86b9ba2: Status 404 returned error can't find the container with id f281948d57ac78146bcebd5cd122bba0bb90bd38f7e055f22391a4a5a86b9ba2 Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 00:12:56.496124 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" event={"ID":"c21f0998-2c71-4243-91a2-7478cbfeea60","Type":"ContainerStarted","Data":"03aa1ca0b49576083a412b17d7a9e43b67aa13fbf9ca9eb41f66410bfb68f6f9"} Mar 12 00:12:56 crc kubenswrapper[4870]: I0312 00:12:56.500101 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" event={"ID":"a21602ca-9c73-4523-bb58-e165561d8d43","Type":"ContainerStarted","Data":"f281948d57ac78146bcebd5cd122bba0bb90bd38f7e055f22391a4a5a86b9ba2"} Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.465380 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hh9h"] Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.466071 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8hh9h" podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="registry-server" containerID="cri-o://e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727" gracePeriod=2 Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.514269 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" event={"ID":"a21602ca-9c73-4523-bb58-e165561d8d43","Type":"ContainerStarted","Data":"dc09595cec8aca45de8095abb16c393f7f7843710552ed15219c91bba634d8b9"} Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.514335 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.519042 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" event={"ID":"c21f0998-2c71-4243-91a2-7478cbfeea60","Type":"ContainerStarted","Data":"eb58a25b806ae654ef70820eda99b86a12233d0f4613c1f8c546b8cc9c7b5319"} Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.519535 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.536010 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.536384 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.559979 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" podStartSLOduration=3.559951904 podStartE2EDuration="3.559951904s" podCreationTimestamp="2026-03-12 00:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:12:57.553371643 +0000 UTC m=+268.156787993" 
watchObservedRunningTime="2026-03-12 00:12:57.559951904 +0000 UTC m=+268.163368254" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.601819 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" podStartSLOduration=3.60180313 podStartE2EDuration="3.60180313s" podCreationTimestamp="2026-03-12 00:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:12:57.601565393 +0000 UTC m=+268.204981703" watchObservedRunningTime="2026-03-12 00:12:57.60180313 +0000 UTC m=+268.205219440" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.896852 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.941564 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-catalog-content\") pod \"3ba3252e-f349-49ce-87d9-64172121150c\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.941661 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-utilities\") pod \"3ba3252e-f349-49ce-87d9-64172121150c\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.941736 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fx54\" (UniqueName: \"kubernetes.io/projected/3ba3252e-f349-49ce-87d9-64172121150c-kube-api-access-6fx54\") pod \"3ba3252e-f349-49ce-87d9-64172121150c\" (UID: \"3ba3252e-f349-49ce-87d9-64172121150c\") " Mar 12 00:12:57 crc kubenswrapper[4870]: 
I0312 00:12:57.943835 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-utilities" (OuterVolumeSpecName: "utilities") pod "3ba3252e-f349-49ce-87d9-64172121150c" (UID: "3ba3252e-f349-49ce-87d9-64172121150c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.948328 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba3252e-f349-49ce-87d9-64172121150c-kube-api-access-6fx54" (OuterVolumeSpecName: "kube-api-access-6fx54") pod "3ba3252e-f349-49ce-87d9-64172121150c" (UID: "3ba3252e-f349-49ce-87d9-64172121150c"). InnerVolumeSpecName "kube-api-access-6fx54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:57 crc kubenswrapper[4870]: I0312 00:12:57.992105 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ba3252e-f349-49ce-87d9-64172121150c" (UID: "3ba3252e-f349-49ce-87d9-64172121150c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.042909 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.042961 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba3252e-f349-49ce-87d9-64172121150c-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.042984 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fx54\" (UniqueName: \"kubernetes.io/projected/3ba3252e-f349-49ce-87d9-64172121150c-kube-api-access-6fx54\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.462052 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5gt9r"] Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.462423 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5gt9r" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="registry-server" containerID="cri-o://c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4" gracePeriod=2 Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.528418 4870 generic.go:334] "Generic (PLEG): container finished" podID="3ba3252e-f349-49ce-87d9-64172121150c" containerID="e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727" exitCode=0 Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.528510 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hh9h" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.528501 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hh9h" event={"ID":"3ba3252e-f349-49ce-87d9-64172121150c","Type":"ContainerDied","Data":"e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727"} Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.528582 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hh9h" event={"ID":"3ba3252e-f349-49ce-87d9-64172121150c","Type":"ContainerDied","Data":"a3376767c9ab4285e9f0b69ae77a4c1d7da1be61193ace45a09c6c034120607b"} Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.528617 4870 scope.go:117] "RemoveContainer" containerID="e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.640315 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hh9h"] Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.643720 4870 scope.go:117] "RemoveContainer" containerID="34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.646786 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8hh9h"] Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.665553 4870 scope.go:117] "RemoveContainer" containerID="615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.696874 4870 scope.go:117] "RemoveContainer" containerID="e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727" Mar 12 00:12:58 crc kubenswrapper[4870]: E0312 00:12:58.697499 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727\": container with ID starting with e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727 not found: ID does not exist" containerID="e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.697558 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727"} err="failed to get container status \"e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727\": rpc error: code = NotFound desc = could not find container \"e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727\": container with ID starting with e44e21d6d8e3be268a642c2136bb429ce30fa78266389e8144620bbb73d7c727 not found: ID does not exist" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.697595 4870 scope.go:117] "RemoveContainer" containerID="34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c" Mar 12 00:12:58 crc kubenswrapper[4870]: E0312 00:12:58.698223 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c\": container with ID starting with 34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c not found: ID does not exist" containerID="34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.698270 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c"} err="failed to get container status \"34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c\": rpc error: code = NotFound desc = could not find container \"34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c\": container with ID 
starting with 34330fd6624a55669006102aaa4ec820c8ee807ba254485adb4b78a4189dff6c not found: ID does not exist" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.698297 4870 scope.go:117] "RemoveContainer" containerID="615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1" Mar 12 00:12:58 crc kubenswrapper[4870]: E0312 00:12:58.699006 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1\": container with ID starting with 615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1 not found: ID does not exist" containerID="615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.699138 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1"} err="failed to get container status \"615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1\": rpc error: code = NotFound desc = could not find container \"615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1\": container with ID starting with 615f2534c39bb9722d34c0ad0c85a68c280662c45f75f065b55231d5ff017fb1 not found: ID does not exist" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.871851 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.877383 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-78vzj" Mar 12 00:12:58 crc kubenswrapper[4870]: I0312 00:12:58.922737 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-78vzj" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.055549 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-catalog-content\") pod \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.055974 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-utilities\") pod \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.056068 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfg7k\" (UniqueName: \"kubernetes.io/projected/985b1034-4300-4cdf-a09a-33d70a0ea7b0-kube-api-access-hfg7k\") pod \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\" (UID: \"985b1034-4300-4cdf-a09a-33d70a0ea7b0\") " Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.057587 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-utilities" (OuterVolumeSpecName: "utilities") pod "985b1034-4300-4cdf-a09a-33d70a0ea7b0" (UID: "985b1034-4300-4cdf-a09a-33d70a0ea7b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.062261 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985b1034-4300-4cdf-a09a-33d70a0ea7b0-kube-api-access-hfg7k" (OuterVolumeSpecName: "kube-api-access-hfg7k") pod "985b1034-4300-4cdf-a09a-33d70a0ea7b0" (UID: "985b1034-4300-4cdf-a09a-33d70a0ea7b0"). InnerVolumeSpecName "kube-api-access-hfg7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.128284 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "985b1034-4300-4cdf-a09a-33d70a0ea7b0" (UID: "985b1034-4300-4cdf-a09a-33d70a0ea7b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.157900 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfg7k\" (UniqueName: \"kubernetes.io/projected/985b1034-4300-4cdf-a09a-33d70a0ea7b0-kube-api-access-hfg7k\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.158528 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.158659 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985b1034-4300-4cdf-a09a-33d70a0ea7b0-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.300411 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r5h2s" Mar 12 00:12:59 crc 
kubenswrapper[4870]: I0312 00:12:59.353583 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r5h2s" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.537604 4870 generic.go:334] "Generic (PLEG): container finished" podID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerID="c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4" exitCode=0 Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.537722 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5gt9r" event={"ID":"985b1034-4300-4cdf-a09a-33d70a0ea7b0","Type":"ContainerDied","Data":"c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4"} Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.537763 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5gt9r" event={"ID":"985b1034-4300-4cdf-a09a-33d70a0ea7b0","Type":"ContainerDied","Data":"07b3bff8d24561e4e8ca049873e7feade2cd9bbb30a978ace906c05fc82bc751"} Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.537783 4870 scope.go:117] "RemoveContainer" containerID="c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.538983 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5gt9r" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.560258 4870 scope.go:117] "RemoveContainer" containerID="c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.575519 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5gt9r"] Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.578290 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5gt9r"] Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.592272 4870 scope.go:117] "RemoveContainer" containerID="b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.635921 4870 scope.go:117] "RemoveContainer" containerID="c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4" Mar 12 00:12:59 crc kubenswrapper[4870]: E0312 00:12:59.636688 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4\": container with ID starting with c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4 not found: ID does not exist" containerID="c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.636726 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4"} err="failed to get container status \"c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4\": rpc error: code = NotFound desc = could not find container \"c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4\": container with ID starting with c325cef51e97bbe638cae902e25baa8aad5a773f3b6db66822652bab465392a4 not 
found: ID does not exist" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.636752 4870 scope.go:117] "RemoveContainer" containerID="c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc" Mar 12 00:12:59 crc kubenswrapper[4870]: E0312 00:12:59.637569 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc\": container with ID starting with c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc not found: ID does not exist" containerID="c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.637604 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc"} err="failed to get container status \"c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc\": rpc error: code = NotFound desc = could not find container \"c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc\": container with ID starting with c529e0393314a24fbab4f97b48c1b38599a9607401ac2fabe680fba33e03c7dc not found: ID does not exist" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.637626 4870 scope.go:117] "RemoveContainer" containerID="b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d" Mar 12 00:12:59 crc kubenswrapper[4870]: E0312 00:12:59.637881 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d\": container with ID starting with b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d not found: ID does not exist" containerID="b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d" Mar 12 00:12:59 crc kubenswrapper[4870]: I0312 00:12:59.637908 4870 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d"} err="failed to get container status \"b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d\": rpc error: code = NotFound desc = could not find container \"b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d\": container with ID starting with b7425d0122dbfb9eeeb345b82b544fd07a2bbdac51136f8ca78253d75c90b68d not found: ID does not exist" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.082194 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" podUID="368b0e75-e87a-43f9-9369-588871bf28be" containerName="oauth-openshift" containerID="cri-o://a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e" gracePeriod=15 Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.111965 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba3252e-f349-49ce-87d9-64172121150c" path="/var/lib/kubelet/pods/3ba3252e-f349-49ce-87d9-64172121150c/volumes" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.113076 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" path="/var/lib/kubelet/pods/985b1034-4300-4cdf-a09a-33d70a0ea7b0/volumes" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.519891 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.545880 4870 generic.go:334] "Generic (PLEG): container finished" podID="368b0e75-e87a-43f9-9369-588871bf28be" containerID="a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e" exitCode=0 Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.545924 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" event={"ID":"368b0e75-e87a-43f9-9369-588871bf28be","Type":"ContainerDied","Data":"a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e"} Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.545985 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" event={"ID":"368b0e75-e87a-43f9-9369-588871bf28be","Type":"ContainerDied","Data":"8babf072d8163389193d6157059eab62d4a37451c3217f76c6e5c63bb846ebc9"} Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.546020 4870 scope.go:117] "RemoveContainer" containerID="a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.546207 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7gfg9" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.570791 4870 scope.go:117] "RemoveContainer" containerID="a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e" Mar 12 00:13:00 crc kubenswrapper[4870]: E0312 00:13:00.573366 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e\": container with ID starting with a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e not found: ID does not exist" containerID="a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.573415 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e"} err="failed to get container status \"a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e\": rpc error: code = NotFound desc = could not find container \"a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e\": container with ID starting with a1f214ff220b8f4127585afc23f429579c705c022ca26b10611a042301a3f01e not found: ID does not exist" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.676308 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78dhb\" (UniqueName: \"kubernetes.io/projected/368b0e75-e87a-43f9-9369-588871bf28be-kube-api-access-78dhb\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.676630 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-audit-policies\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" 
(UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.676741 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-error\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677358 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-router-certs\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677536 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-trusted-ca-bundle\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677630 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-service-ca\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677706 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-session\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc 
kubenswrapper[4870]: I0312 00:13:00.677787 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-login\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677872 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/368b0e75-e87a-43f9-9369-588871bf28be-audit-dir\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677966 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-provider-selection\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.678040 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-serving-cert\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677374 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.677958 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/368b0e75-e87a-43f9-9369-588871bf28be-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.678405 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-ocp-branding-template\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.678482 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-cliconfig\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.678563 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-idp-0-file-data\") pod \"368b0e75-e87a-43f9-9369-588871bf28be\" (UID: \"368b0e75-e87a-43f9-9369-588871bf28be\") " Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.679825 4870 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/368b0e75-e87a-43f9-9369-588871bf28be-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.679988 4870 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.678885 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.679024 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.679616 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.681574 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.681737 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368b0e75-e87a-43f9-9369-588871bf28be-kube-api-access-78dhb" (OuterVolumeSpecName: "kube-api-access-78dhb") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "kube-api-access-78dhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.682448 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.693555 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.693590 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.694117 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.695533 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.696529 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.696648 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "368b0e75-e87a-43f9-9369-588871bf28be" (UID: "368b0e75-e87a-43f9-9369-588871bf28be"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781130 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781186 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781223 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781235 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78dhb\" (UniqueName: \"kubernetes.io/projected/368b0e75-e87a-43f9-9369-588871bf28be-kube-api-access-78dhb\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781245 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781254 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781263 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781274 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781282 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781291 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.781301 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 
00:13:00.781311 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/368b0e75-e87a-43f9-9369-588871bf28be-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.872081 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7gfg9"] Mar 12 00:13:00 crc kubenswrapper[4870]: I0312 00:13:00.875513 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7gfg9"] Mar 12 00:13:01 crc kubenswrapper[4870]: I0312 00:13:01.850426 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5h2s"] Mar 12 00:13:01 crc kubenswrapper[4870]: I0312 00:13:01.850777 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r5h2s" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="registry-server" containerID="cri-o://1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300" gracePeriod=2 Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.109351 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368b0e75-e87a-43f9-9369-588871bf28be" path="/var/lib/kubelet/pods/368b0e75-e87a-43f9-9369-588871bf28be/volumes" Mar 12 00:13:02 crc kubenswrapper[4870]: E0312 00:13:02.115233 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6a8bb4_df10_46c3_91e6_826e501be09f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b0ce3c_f432_48f4_81ed_62cf96995f8d.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice/crio-df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice\": RecentStats: unable to find data in memory cache]" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.431767 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5h2s" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.504296 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-catalog-content\") pod \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.504347 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-utilities\") pod \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.504468 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct4t7\" (UniqueName: \"kubernetes.io/projected/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-kube-api-access-ct4t7\") pod \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\" (UID: \"5a32152f-50ce-4712-8ea4-dc6b72dc6f08\") " Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.505539 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-utilities" (OuterVolumeSpecName: "utilities") pod "5a32152f-50ce-4712-8ea4-dc6b72dc6f08" (UID: "5a32152f-50ce-4712-8ea4-dc6b72dc6f08"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.514292 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-kube-api-access-ct4t7" (OuterVolumeSpecName: "kube-api-access-ct4t7") pod "5a32152f-50ce-4712-8ea4-dc6b72dc6f08" (UID: "5a32152f-50ce-4712-8ea4-dc6b72dc6f08"). InnerVolumeSpecName "kube-api-access-ct4t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.564263 4870 generic.go:334] "Generic (PLEG): container finished" podID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerID="1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300" exitCode=0 Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.564380 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5h2s" event={"ID":"5a32152f-50ce-4712-8ea4-dc6b72dc6f08","Type":"ContainerDied","Data":"1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300"} Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.564451 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5h2s" event={"ID":"5a32152f-50ce-4712-8ea4-dc6b72dc6f08","Type":"ContainerDied","Data":"c1507d8d10e043a4144952f8b20692f160f50f3b9559ea356e9a2e7c3f106e9a"} Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.564464 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r5h2s" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.564475 4870 scope.go:117] "RemoveContainer" containerID="1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.585847 4870 scope.go:117] "RemoveContainer" containerID="5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.606548 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct4t7\" (UniqueName: \"kubernetes.io/projected/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-kube-api-access-ct4t7\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.606598 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.611316 4870 scope.go:117] "RemoveContainer" containerID="edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.632178 4870 scope.go:117] "RemoveContainer" containerID="1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300" Mar 12 00:13:02 crc kubenswrapper[4870]: E0312 00:13:02.632915 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300\": container with ID starting with 1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300 not found: ID does not exist" containerID="1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.632967 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300"} err="failed to get container status \"1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300\": rpc error: code = NotFound desc = could not find container \"1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300\": container with ID starting with 1aeee90b62a252cc9e6a6482b23cffd0bac98707a6c8fd9ca941a281bc898300 not found: ID does not exist" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.633007 4870 scope.go:117] "RemoveContainer" containerID="5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003" Mar 12 00:13:02 crc kubenswrapper[4870]: E0312 00:13:02.633584 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003\": container with ID starting with 5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003 not found: ID does not exist" containerID="5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.633616 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003"} err="failed to get container status \"5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003\": rpc error: code = NotFound desc = could not find container \"5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003\": container with ID starting with 5c1a5b915638fb734a68a6408869bfcc738cafc13499899a9dd1e50cf8bbc003 not found: ID does not exist" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.633636 4870 scope.go:117] "RemoveContainer" containerID="edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4" Mar 12 00:13:02 crc kubenswrapper[4870]: E0312 00:13:02.634113 4870 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4\": container with ID starting with edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4 not found: ID does not exist" containerID="edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.634271 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4"} err="failed to get container status \"edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4\": rpc error: code = NotFound desc = could not find container \"edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4\": container with ID starting with edd4df09f53681310264ef30161e9f9ee21b2117ac478f6a7f760539726a73b4 not found: ID does not exist" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.656385 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a32152f-50ce-4712-8ea4-dc6b72dc6f08" (UID: "5a32152f-50ce-4712-8ea4-dc6b72dc6f08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.715525 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a32152f-50ce-4712-8ea4-dc6b72dc6f08-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.909843 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5h2s"] Mar 12 00:13:02 crc kubenswrapper[4870]: I0312 00:13:02.920382 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r5h2s"] Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.452094 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl"] Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.452549 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.452587 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.452609 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="extract-content" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.452623 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="extract-content" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453294 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="extract-utilities" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453319 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="extract-utilities" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453340 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="extract-utilities" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453353 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="extract-utilities" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453386 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="extract-content" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453399 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="extract-content" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453421 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="extract-utilities" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453436 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="extract-utilities" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453458 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368b0e75-e87a-43f9-9369-588871bf28be" containerName="oauth-openshift" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453471 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="368b0e75-e87a-43f9-9369-588871bf28be" containerName="oauth-openshift" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453505 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="extract-content" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453522 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="extract-content" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453552 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453589 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: E0312 00:13:03.453642 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453655 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453928 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="985b1034-4300-4cdf-a09a-33d70a0ea7b0" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453959 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="368b0e75-e87a-43f9-9369-588871bf28be" containerName="oauth-openshift" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453976 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba3252e-f349-49ce-87d9-64172121150c" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.453992 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" containerName="registry-server" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.454818 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.458734 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.459913 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.460131 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.462001 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.466600 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.480044 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.480389 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.483572 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.483758 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.483900 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 00:13:03 crc 
kubenswrapper[4870]: I0312 00:13:03.484052 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.484228 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.484375 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.484509 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.496811 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.513822 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl"] Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.527714 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.527767 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: 
\"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.527805 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.527830 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnw5m\" (UniqueName: \"kubernetes.io/projected/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-kube-api-access-dnw5m\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.527860 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528022 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: 
I0312 00:13:03.528083 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-error\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528208 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-audit-dir\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528286 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-session\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528342 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528372 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-login\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528460 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528513 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.528600 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-audit-policies\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630096 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " 
pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630194 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnw5m\" (UniqueName: \"kubernetes.io/projected/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-kube-api-access-dnw5m\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630235 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630267 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630292 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-error\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630336 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-audit-dir\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630363 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-session\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630392 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630642 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-audit-dir\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630413 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-login\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: 
I0312 00:13:03.630959 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.630993 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.631017 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-audit-policies\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.631040 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.631089 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.631227 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.631627 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.632402 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-audit-policies\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.633037 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 
crc kubenswrapper[4870]: I0312 00:13:03.634857 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.635176 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-error\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.634858 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-session\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.635053 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.634898 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.635989 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.639585 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.642494 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-v4-0-config-user-template-login\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.648630 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnw5m\" (UniqueName: \"kubernetes.io/projected/a1ca7e31-8204-4ccd-85e7-2a37311e0de1-kube-api-access-dnw5m\") pod \"oauth-openshift-5f94cbd6cf-ncvfl\" (UID: \"a1ca7e31-8204-4ccd-85e7-2a37311e0de1\") " pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 
00:13:03 crc kubenswrapper[4870]: I0312 00:13:03.822028 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:04 crc kubenswrapper[4870]: I0312 00:13:04.111586 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a32152f-50ce-4712-8ea4-dc6b72dc6f08" path="/var/lib/kubelet/pods/5a32152f-50ce-4712-8ea4-dc6b72dc6f08/volumes" Mar 12 00:13:04 crc kubenswrapper[4870]: I0312 00:13:04.272172 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl"] Mar 12 00:13:04 crc kubenswrapper[4870]: I0312 00:13:04.580966 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" event={"ID":"a1ca7e31-8204-4ccd-85e7-2a37311e0de1","Type":"ContainerStarted","Data":"bc242901bfd7fb071ebf5a395749ba5ce5d805d4f665eac8020b3e6d32983b51"} Mar 12 00:13:05 crc kubenswrapper[4870]: I0312 00:13:05.590843 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" event={"ID":"a1ca7e31-8204-4ccd-85e7-2a37311e0de1","Type":"ContainerStarted","Data":"00dfbc37f04353aa102b07af1b0a050128f72ebacc968eb5c09abb3252e21cb8"} Mar 12 00:13:05 crc kubenswrapper[4870]: I0312 00:13:05.591417 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:05 crc kubenswrapper[4870]: I0312 00:13:05.599343 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" Mar 12 00:13:05 crc kubenswrapper[4870]: I0312 00:13:05.621602 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5f94cbd6cf-ncvfl" podStartSLOduration=30.621584656 podStartE2EDuration="30.621584656s" podCreationTimestamp="2026-03-12 
00:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:13:05.617518328 +0000 UTC m=+276.220934678" watchObservedRunningTime="2026-03-12 00:13:05.621584656 +0000 UTC m=+276.225000956" Mar 12 00:13:12 crc kubenswrapper[4870]: E0312 00:13:12.228315 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6a8bb4_df10_46c3_91e6_826e501be09f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b0ce3c_f432_48f4_81ed_62cf96995f8d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice/crio-df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice\": RecentStats: unable to find data in memory cache]" Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.297846 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp"] Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.299962 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" containerName="controller-manager" containerID="cri-o://dc09595cec8aca45de8095abb16c393f7f7843710552ed15219c91bba634d8b9" gracePeriod=30 Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.390644 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r"] Mar 12 00:13:14 crc 
kubenswrapper[4870]: I0312 00:13:14.391085 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" containerName="route-controller-manager" containerID="cri-o://eb58a25b806ae654ef70820eda99b86a12233d0f4613c1f8c546b8cc9c7b5319" gracePeriod=30 Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.661876 4870 generic.go:334] "Generic (PLEG): container finished" podID="c21f0998-2c71-4243-91a2-7478cbfeea60" containerID="eb58a25b806ae654ef70820eda99b86a12233d0f4613c1f8c546b8cc9c7b5319" exitCode=0 Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.661932 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" event={"ID":"c21f0998-2c71-4243-91a2-7478cbfeea60","Type":"ContainerDied","Data":"eb58a25b806ae654ef70820eda99b86a12233d0f4613c1f8c546b8cc9c7b5319"} Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.662988 4870 generic.go:334] "Generic (PLEG): container finished" podID="a21602ca-9c73-4523-bb58-e165561d8d43" containerID="dc09595cec8aca45de8095abb16c393f7f7843710552ed15219c91bba634d8b9" exitCode=0 Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.663013 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" event={"ID":"a21602ca-9c73-4523-bb58-e165561d8d43","Type":"ContainerDied","Data":"dc09595cec8aca45de8095abb16c393f7f7843710552ed15219c91bba634d8b9"} Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.873506 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:13:14 crc kubenswrapper[4870]: I0312 00:13:14.916791 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.008080 4870 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.008354 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" containerName="route-controller-manager" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.008369 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" containerName="route-controller-manager" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.008394 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" containerName="controller-manager" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.008400 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" containerName="controller-manager" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.008487 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" containerName="controller-manager" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.008502 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" containerName="route-controller-manager" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.008871 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011470 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxfc2\" (UniqueName: \"kubernetes.io/projected/a21602ca-9c73-4523-bb58-e165561d8d43-kube-api-access-lxfc2\") pod \"a21602ca-9c73-4523-bb58-e165561d8d43\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011511 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-config\") pod \"c21f0998-2c71-4243-91a2-7478cbfeea60\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011561 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-proxy-ca-bundles\") pod \"a21602ca-9c73-4523-bb58-e165561d8d43\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011600 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21602ca-9c73-4523-bb58-e165561d8d43-serving-cert\") pod \"a21602ca-9c73-4523-bb58-e165561d8d43\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011619 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-config\") pod \"a21602ca-9c73-4523-bb58-e165561d8d43\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011646 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ljt9w\" (UniqueName: \"kubernetes.io/projected/c21f0998-2c71-4243-91a2-7478cbfeea60-kube-api-access-ljt9w\") pod \"c21f0998-2c71-4243-91a2-7478cbfeea60\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011667 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-client-ca\") pod \"a21602ca-9c73-4523-bb58-e165561d8d43\" (UID: \"a21602ca-9c73-4523-bb58-e165561d8d43\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011696 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-client-ca\") pod \"c21f0998-2c71-4243-91a2-7478cbfeea60\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.011728 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21f0998-2c71-4243-91a2-7478cbfeea60-serving-cert\") pod \"c21f0998-2c71-4243-91a2-7478cbfeea60\" (UID: \"c21f0998-2c71-4243-91a2-7478cbfeea60\") " Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.012669 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a21602ca-9c73-4523-bb58-e165561d8d43" (UID: "a21602ca-9c73-4523-bb58-e165561d8d43"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.012835 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-config" (OuterVolumeSpecName: "config") pod "c21f0998-2c71-4243-91a2-7478cbfeea60" (UID: "c21f0998-2c71-4243-91a2-7478cbfeea60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.013208 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-client-ca" (OuterVolumeSpecName: "client-ca") pod "c21f0998-2c71-4243-91a2-7478cbfeea60" (UID: "c21f0998-2c71-4243-91a2-7478cbfeea60"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.013245 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-client-ca" (OuterVolumeSpecName: "client-ca") pod "a21602ca-9c73-4523-bb58-e165561d8d43" (UID: "a21602ca-9c73-4523-bb58-e165561d8d43"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.013835 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-config" (OuterVolumeSpecName: "config") pod "a21602ca-9c73-4523-bb58-e165561d8d43" (UID: "a21602ca-9c73-4523-bb58-e165561d8d43"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.017459 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21602ca-9c73-4523-bb58-e165561d8d43-kube-api-access-lxfc2" (OuterVolumeSpecName: "kube-api-access-lxfc2") pod "a21602ca-9c73-4523-bb58-e165561d8d43" (UID: "a21602ca-9c73-4523-bb58-e165561d8d43"). InnerVolumeSpecName "kube-api-access-lxfc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.017539 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21f0998-2c71-4243-91a2-7478cbfeea60-kube-api-access-ljt9w" (OuterVolumeSpecName: "kube-api-access-ljt9w") pod "c21f0998-2c71-4243-91a2-7478cbfeea60" (UID: "c21f0998-2c71-4243-91a2-7478cbfeea60"). InnerVolumeSpecName "kube-api-access-ljt9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.017979 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21602ca-9c73-4523-bb58-e165561d8d43-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a21602ca-9c73-4523-bb58-e165561d8d43" (UID: "a21602ca-9c73-4523-bb58-e165561d8d43"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.018025 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21f0998-2c71-4243-91a2-7478cbfeea60-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c21f0998-2c71-4243-91a2-7478cbfeea60" (UID: "c21f0998-2c71-4243-91a2-7478cbfeea60"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.049922 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.060834 4870 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.061209 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0" gracePeriod=15 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.061275 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0" gracePeriod=15 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.061341 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe" gracePeriod=15 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.061394 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135" gracePeriod=15 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.061437 4870 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd" gracePeriod=15 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064227 4870 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064468 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064489 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064502 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064512 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064523 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064531 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064544 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064552 4870 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064573 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064581 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064593 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064601 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064613 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064621 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064635 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064643 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064655 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064663 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064772 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064785 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064794 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064807 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064819 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064832 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064841 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064852 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.064981 4870 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.064990 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.065107 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.114716 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.114764 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.114788 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.114816 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.114966 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115090 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115207 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115303 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115426 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljt9w\" (UniqueName: 
\"kubernetes.io/projected/c21f0998-2c71-4243-91a2-7478cbfeea60-kube-api-access-ljt9w\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115450 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115464 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115476 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21f0998-2c71-4243-91a2-7478cbfeea60-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115491 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxfc2\" (UniqueName: \"kubernetes.io/projected/a21602ca-9c73-4523-bb58-e165561d8d43-kube-api-access-lxfc2\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115503 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21f0998-2c71-4243-91a2-7478cbfeea60-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115515 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.115527 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21602ca-9c73-4523-bb58-e165561d8d43-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc 
kubenswrapper[4870]: I0312 00:13:15.115540 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21602ca-9c73-4523-bb58-e165561d8d43-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216107 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216192 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216228 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216271 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216289 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216331 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216373 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216396 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216476 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216537 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" 
(UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216566 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216590 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216909 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.216956 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.217405 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.217465 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.332761 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:15 crc kubenswrapper[4870]: W0312 00:13:15.352721 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-93b2b8a0644a7f8825e161c58eb2ff103e4c79bac6c806fd51f87444a2994ca8 WatchSource:0}: Error finding container 93b2b8a0644a7f8825e161c58eb2ff103e4c79bac6c806fd51f87444a2994ca8: Status 404 returned error can't find the container with id 93b2b8a0644a7f8825e161c58eb2ff103e4c79bac6c806fd51f87444a2994ca8 Mar 12 00:13:15 crc kubenswrapper[4870]: E0312 00:13:15.355746 4870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189befaad86dfd74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:13:15.354987892 +0000 UTC m=+285.958404212,LastTimestamp:2026-03-12 00:13:15.354987892 +0000 UTC m=+285.958404212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.670548 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.670544 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" event={"ID":"a21602ca-9c73-4523-bb58-e165561d8d43","Type":"ContainerDied","Data":"f281948d57ac78146bcebd5cd122bba0bb90bd38f7e055f22391a4a5a86b9ba2"} Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.670924 4870 scope.go:117] "RemoveContainer" containerID="dc09595cec8aca45de8095abb16c393f7f7843710552ed15219c91bba634d8b9" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.671585 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.672288 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.672644 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.674294 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.676976 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.677802 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0" exitCode=0 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.677834 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe" exitCode=0 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.677847 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135" exitCode=0 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.677861 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd" exitCode=2 Mar 12 00:13:15 crc 
kubenswrapper[4870]: I0312 00:13:15.682186 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" event={"ID":"c21f0998-2c71-4243-91a2-7478cbfeea60","Type":"ContainerDied","Data":"03aa1ca0b49576083a412b17d7a9e43b67aa13fbf9ca9eb41f66410bfb68f6f9"} Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.682263 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.684170 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.684406 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.684668 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.684994 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" 
pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.686714 4870 generic.go:334] "Generic (PLEG): container finished" podID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" containerID="c2518116aabefc160e23c5e45118bc636c9496332ec02bc174106da71fa920d3" exitCode=0 Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.686784 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29fd0efd-a422-4b2d-b22c-f2bda94a368d","Type":"ContainerDied","Data":"c2518116aabefc160e23c5e45118bc636c9496332ec02bc174106da71fa920d3"} Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.687280 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.687487 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.687646 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.687804 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.687977 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.688123 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.688321 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.688489 4870 status_manager.go:851] "Failed to get status for pod" 
podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.688633 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.693246 4870 scope.go:117] "RemoveContainer" containerID="87966015c93d9f6b80016f60497fac7096ecdd0798637e36e2c6d484ba8ec7a0" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.694474 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77"} Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.694629 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"93b2b8a0644a7f8825e161c58eb2ff103e4c79bac6c806fd51f87444a2994ca8"} Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.696538 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 
00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.696824 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.696999 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.697187 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.697379 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.725430 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 
38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.725631 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.725820 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.725999 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.726217 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:15 crc kubenswrapper[4870]: I0312 00:13:15.731873 4870 scope.go:117] "RemoveContainer" containerID="eb58a25b806ae654ef70820eda99b86a12233d0f4613c1f8c546b8cc9c7b5319" Mar 12 00:13:16 crc kubenswrapper[4870]: I0312 00:13:16.704383 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 00:13:16 crc kubenswrapper[4870]: E0312 00:13:16.758471 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:16 crc kubenswrapper[4870]: E0312 00:13:16.758834 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:16 crc kubenswrapper[4870]: E0312 00:13:16.759115 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 
00:13:16 crc kubenswrapper[4870]: E0312 00:13:16.759462 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:16 crc kubenswrapper[4870]: E0312 00:13:16.759744 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:16 crc kubenswrapper[4870]: E0312 00:13:16.759778 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.055194 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.056273 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.057086 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.057949 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" 
pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.058709 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.139698 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-var-lock\") pod \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.139787 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kubelet-dir\") pod \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.139850 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kube-api-access\") pod \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\" (UID: \"29fd0efd-a422-4b2d-b22c-f2bda94a368d\") " Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.140802 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-var-lock" (OuterVolumeSpecName: "var-lock") 
pod "29fd0efd-a422-4b2d-b22c-f2bda94a368d" (UID: "29fd0efd-a422-4b2d-b22c-f2bda94a368d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.140853 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29fd0efd-a422-4b2d-b22c-f2bda94a368d" (UID: "29fd0efd-a422-4b2d-b22c-f2bda94a368d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.166521 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29fd0efd-a422-4b2d-b22c-f2bda94a368d" (UID: "29fd0efd-a422-4b2d-b22c-f2bda94a368d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.241001 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.241336 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd0efd-a422-4b2d-b22c-f2bda94a368d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.241352 4870 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29fd0efd-a422-4b2d-b22c-f2bda94a368d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.385521 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.386734 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.387441 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.388005 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.388440 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.388804 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: 
connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.389381 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444108 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444324 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444351 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444438 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444503 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444530 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444829 4870 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444865 4870 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.444888 4870 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.726683 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.728028 4870 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0" exitCode=0 Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.728229 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.728242 4870 scope.go:117] "RemoveContainer" containerID="e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.731401 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29fd0efd-a422-4b2d-b22c-f2bda94a368d","Type":"ContainerDied","Data":"777a189827dd205699559065ca791e65a0fce965345393a9ac81c1684dbd9269"} Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.731468 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="777a189827dd205699559065ca791e65a0fce965345393a9ac81c1684dbd9269" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.731473 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.754021 4870 scope.go:117] "RemoveContainer" containerID="ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.764488 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.765196 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.765981 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.766509 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 
00:13:17.767086 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.767825 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.768204 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.768760 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.769628 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 
crc kubenswrapper[4870]: I0312 00:13:17.770188 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.777477 4870 scope.go:117] "RemoveContainer" containerID="71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.800474 4870 scope.go:117] "RemoveContainer" containerID="ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.828137 4870 scope.go:117] "RemoveContainer" containerID="e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.857411 4870 scope.go:117] "RemoveContainer" containerID="11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.886411 4870 scope.go:117] "RemoveContainer" containerID="e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0" Mar 12 00:13:17 crc kubenswrapper[4870]: E0312 00:13:17.886962 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\": container with ID starting with e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0 not found: ID does not exist" containerID="e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.887037 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0"} err="failed to get container status \"e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\": rpc error: code = NotFound desc = could not find container \"e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0\": container with ID starting with e3005a9ea6b2927a02745c0782187cffa63170087815ef4c7c4bfb4e581775c0 not found: ID does not exist" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.887087 4870 scope.go:117] "RemoveContainer" containerID="ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe" Mar 12 00:13:17 crc kubenswrapper[4870]: E0312 00:13:17.887633 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\": container with ID starting with ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe not found: ID does not exist" containerID="ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.887699 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe"} err="failed to get container status \"ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\": rpc error: code = NotFound desc = could not find container \"ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe\": container with ID starting with ee72d3feb4870db8eb7f9caa58ac0f5f547bb4ec1c39e7f5a752eb70325fddbe not found: ID does not exist" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.887737 4870 scope.go:117] "RemoveContainer" containerID="71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135" Mar 12 00:13:17 crc kubenswrapper[4870]: E0312 00:13:17.888293 4870 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\": container with ID starting with 71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135 not found: ID does not exist" containerID="71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.888345 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135"} err="failed to get container status \"71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\": rpc error: code = NotFound desc = could not find container \"71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135\": container with ID starting with 71733d47d931e15e41eb9001cc771ded138f10f6452c16606d4be382a1efe135 not found: ID does not exist" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.888383 4870 scope.go:117] "RemoveContainer" containerID="ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd" Mar 12 00:13:17 crc kubenswrapper[4870]: E0312 00:13:17.888950 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\": container with ID starting with ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd not found: ID does not exist" containerID="ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.889002 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd"} err="failed to get container status \"ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\": rpc error: code = NotFound desc = could not find container 
\"ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd\": container with ID starting with ebb5b87f78bdc4a07193a4b1573d6d36a2d0fe2637715a129f2460c5958c46fd not found: ID does not exist" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.889031 4870 scope.go:117] "RemoveContainer" containerID="e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0" Mar 12 00:13:17 crc kubenswrapper[4870]: E0312 00:13:17.889597 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\": container with ID starting with e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0 not found: ID does not exist" containerID="e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.889687 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0"} err="failed to get container status \"e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\": rpc error: code = NotFound desc = could not find container \"e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0\": container with ID starting with e1c6824f2c9b1498b3ee483ddeaf92b17b6aa09ac3d128e35baa9c7db9f2dfc0 not found: ID does not exist" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.889762 4870 scope.go:117] "RemoveContainer" containerID="11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562" Mar 12 00:13:17 crc kubenswrapper[4870]: E0312 00:13:17.889561 4870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189befaad86dfd74 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 00:13:15.354987892 +0000 UTC m=+285.958404212,LastTimestamp:2026-03-12 00:13:15.354987892 +0000 UTC m=+285.958404212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 00:13:17 crc kubenswrapper[4870]: E0312 00:13:17.890360 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\": container with ID starting with 11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562 not found: ID does not exist" containerID="11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562" Mar 12 00:13:17 crc kubenswrapper[4870]: I0312 00:13:17.890412 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562"} err="failed to get container status \"11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\": rpc error: code = NotFound desc = could not find container \"11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562\": container with ID starting with 11ef0f9ed757bdb38c24eff520b56a7b9f75d104270a959d1b07701e7c28b562 not found: ID does not exist" Mar 12 00:13:18 crc kubenswrapper[4870]: I0312 00:13:18.119133 4870 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 12 00:13:20 crc kubenswrapper[4870]: I0312 00:13:20.112811 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:20 crc kubenswrapper[4870]: I0312 00:13:20.114351 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:20 crc kubenswrapper[4870]: I0312 00:13:20.115667 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:20 crc kubenswrapper[4870]: I0312 00:13:20.116724 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:20 crc kubenswrapper[4870]: E0312 00:13:20.149400 4870 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC 
openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" volumeName="registry-storage" Mar 12 00:13:22 crc kubenswrapper[4870]: E0312 00:13:22.352032 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6a8bb4_df10_46c3_91e6_826e501be09f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice/crio-df83565821410054a336bf3c21192e42145e4f3f3fde02334542dfbbf52cea36\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53febc79_03f6_4672_889c_818fa0b8d11d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b0ce3c_f432_48f4_81ed_62cf96995f8d.slice\": RecentStats: unable to find data in memory cache]" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.368707 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.369696 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.370407 4870 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.370858 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.375080 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:24 crc kubenswrapper[4870]: I0312 00:13:24.375476 4870 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.376017 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="200ms" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.577871 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="400ms" Mar 12 00:13:24 crc kubenswrapper[4870]: E0312 00:13:24.978991 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="800ms" Mar 12 00:13:25 crc kubenswrapper[4870]: 
E0312 00:13:25.780301 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="1.6s" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.104043 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.104797 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.105439 4870 status_manager.go:851] "Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.105814 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.106336 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.128964 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.129476 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:26 crc kubenswrapper[4870]: E0312 00:13:26.129975 4870 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.130522 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:26 crc kubenswrapper[4870]: W0312 00:13:26.161922 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-c06af01ce098d5af88a19a079d79d0517d96eda084b332610437399492bbba9d WatchSource:0}: Error finding container c06af01ce098d5af88a19a079d79d0517d96eda084b332610437399492bbba9d: Status 404 returned error can't find the container with id c06af01ce098d5af88a19a079d79d0517d96eda084b332610437399492bbba9d Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.805182 4870 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="40a3112601e86c40730df1c6c41eadfa3c7efd022fea88eb186d83f2bbd52172" exitCode=0 Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.805251 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"40a3112601e86c40730df1c6c41eadfa3c7efd022fea88eb186d83f2bbd52172"} Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.805356 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c06af01ce098d5af88a19a079d79d0517d96eda084b332610437399492bbba9d"} Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.805819 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.805848 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.806269 4870 status_manager.go:851] 
"Failed to get status for pod" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:26 crc kubenswrapper[4870]: E0312 00:13:26.806460 4870 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.806956 4870 status_manager.go:851] "Failed to get status for pod" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" pod="openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7c675c78cc-xdz9r\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.807815 4870 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:26 crc kubenswrapper[4870]: I0312 00:13:26.808320 4870 status_manager.go:851] "Failed to get status for pod" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" pod="openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b94cb8b7-n5sdp\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:27 crc kubenswrapper[4870]: E0312 
00:13:27.059602 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:27Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:27Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:27Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T00:13:27Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:27 crc kubenswrapper[4870]: E0312 00:13:27.060169 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:27 crc kubenswrapper[4870]: E0312 00:13:27.060658 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:27 crc kubenswrapper[4870]: E0312 00:13:27.061075 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:27 crc kubenswrapper[4870]: E0312 00:13:27.061536 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Mar 12 00:13:27 crc kubenswrapper[4870]: E0312 00:13:27.061563 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 00:13:27 crc kubenswrapper[4870]: I0312 00:13:27.841169 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"86df9981903618db2658d7052d4336928b64304b03081566abe5c5cb836c0d8c"} Mar 12 00:13:27 crc kubenswrapper[4870]: I0312 00:13:27.841215 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ac72916c4a2507964faafa16168db314932355f6e45f5078513ea20b8a553184"} Mar 12 00:13:27 crc kubenswrapper[4870]: I0312 00:13:27.841225 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a5439e00c69e6b97045f1c871a8913994dda2a4b991c1ce4ce2350c1f5047793"} Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.863530 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ccd7b7ac26c098e67cd0f887017ca0dd17b6bd7fbb279113eb186ca5a6552e85"} Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.863759 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93dab411ab3d35b1aa5efebbf1e8372ce2533e0dd31d3090c0bc4580b3e05d1c"} Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.863890 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.863945 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.863980 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.868803 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.869348 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.869399 4870 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38" exitCode=1 Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.869427 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38"} Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.869914 4870 scope.go:117] "RemoveContainer" 
containerID="5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38" Mar 12 00:13:28 crc kubenswrapper[4870]: I0312 00:13:28.988485 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:13:29 crc kubenswrapper[4870]: I0312 00:13:29.877758 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 00:13:29 crc kubenswrapper[4870]: I0312 00:13:29.878227 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 00:13:29 crc kubenswrapper[4870]: I0312 00:13:29.878265 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e5b1ea1e4dfcd23b4c14e8517cab342dc9c79668bdad21d06bcbdf68f399c90"} Mar 12 00:13:31 crc kubenswrapper[4870]: I0312 00:13:31.131574 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:31 crc kubenswrapper[4870]: I0312 00:13:31.132115 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:31 crc kubenswrapper[4870]: I0312 00:13:31.139118 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:33 crc kubenswrapper[4870]: I0312 00:13:33.876616 4870 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:33 crc kubenswrapper[4870]: I0312 00:13:33.906197 4870 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:33 crc kubenswrapper[4870]: I0312 00:13:33.906226 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:33 crc kubenswrapper[4870]: I0312 00:13:33.911537 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:33 crc kubenswrapper[4870]: I0312 00:13:33.937090 4870 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="05cc920c-de14-4d10-9333-a951ed916cfe" Mar 12 00:13:34 crc kubenswrapper[4870]: I0312 00:13:34.543407 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:13:34 crc kubenswrapper[4870]: I0312 00:13:34.912680 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:34 crc kubenswrapper[4870]: I0312 00:13:34.912727 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:34 crc kubenswrapper[4870]: I0312 00:13:34.916959 4870 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="05cc920c-de14-4d10-9333-a951ed916cfe" Mar 12 00:13:36 crc kubenswrapper[4870]: I0312 00:13:36.711220 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:13:36 crc kubenswrapper[4870]: I0312 00:13:36.711444 4870 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 12 00:13:36 crc kubenswrapper[4870]: I0312 00:13:36.711654 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 12 00:13:40 crc kubenswrapper[4870]: I0312 00:13:40.826809 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 12 00:13:41 crc kubenswrapper[4870]: I0312 00:13:41.504639 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 00:13:42 crc kubenswrapper[4870]: I0312 00:13:42.469617 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.048914 4870 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.056284 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=28.056254829 podStartE2EDuration="28.056254829s" podCreationTimestamp="2026-03-12 00:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:13:33.913426818 +0000 UTC m=+304.516843128" watchObservedRunningTime="2026-03-12 00:13:43.056254829 +0000 UTC m=+313.659671179" Mar 12 00:13:43 crc 
kubenswrapper[4870]: I0312 00:13:43.058716 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b94cb8b7-n5sdp","openshift-kube-apiserver/kube-apiserver-crc","openshift-route-controller-manager/route-controller-manager-7c675c78cc-xdz9r"] Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.058812 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d","openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-bb9f86d54-rpv5x"] Mar 12 00:13:43 crc kubenswrapper[4870]: E0312 00:13:43.059108 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" containerName="installer" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.059189 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" containerName="installer" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.059396 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="29fd0efd-a422-4b2d-b22c-f2bda94a368d" containerName="installer" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.060210 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.064501 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.066019 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.066080 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9e379442-f878-4e5e-beba-10a7caa4107b" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.069611 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.070062 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.070582 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.070781 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.070880 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.071267 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.071614 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.071934 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.071981 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.076735 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.077668 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.078003 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.078346 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.091327 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.139850 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbe6072d-972f-41be-95d5-aa2726ade6cd-client-ca\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.139923 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea6d728-7bd6-45b5-acf1-f433e636df49-serving-cert\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " 
pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.140008 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-client-ca\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.140108 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe6072d-972f-41be-95d5-aa2726ade6cd-config\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.140213 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe6072d-972f-41be-95d5-aa2726ade6cd-serving-cert\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.140256 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-config\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.140341 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-proxy-ca-bundles\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.140396 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8ps\" (UniqueName: \"kubernetes.io/projected/3ea6d728-7bd6-45b5-acf1-f433e636df49-kube-api-access-6b8ps\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.140458 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnf8b\" (UniqueName: \"kubernetes.io/projected/bbe6072d-972f-41be-95d5-aa2726ade6cd-kube-api-access-dnf8b\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242548 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe6072d-972f-41be-95d5-aa2726ade6cd-config\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242642 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe6072d-972f-41be-95d5-aa2726ade6cd-serving-cert\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242696 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-config\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242767 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-proxy-ca-bundles\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242825 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8ps\" (UniqueName: \"kubernetes.io/projected/3ea6d728-7bd6-45b5-acf1-f433e636df49-kube-api-access-6b8ps\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242881 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnf8b\" (UniqueName: \"kubernetes.io/projected/bbe6072d-972f-41be-95d5-aa2726ade6cd-kube-api-access-dnf8b\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242937 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bbe6072d-972f-41be-95d5-aa2726ade6cd-client-ca\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.242971 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea6d728-7bd6-45b5-acf1-f433e636df49-serving-cert\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.243019 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-client-ca\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.245747 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbe6072d-972f-41be-95d5-aa2726ade6cd-client-ca\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.245644 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-proxy-ca-bundles\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.246014 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-client-ca\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.247046 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe6072d-972f-41be-95d5-aa2726ade6cd-config\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.247496 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea6d728-7bd6-45b5-acf1-f433e636df49-config\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.254541 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe6072d-972f-41be-95d5-aa2726ade6cd-serving-cert\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.257503 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea6d728-7bd6-45b5-acf1-f433e636df49-serving-cert\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc 
kubenswrapper[4870]: I0312 00:13:43.271297 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8ps\" (UniqueName: \"kubernetes.io/projected/3ea6d728-7bd6-45b5-acf1-f433e636df49-kube-api-access-6b8ps\") pod \"controller-manager-bb9f86d54-rpv5x\" (UID: \"3ea6d728-7bd6-45b5-acf1-f433e636df49\") " pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.278314 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnf8b\" (UniqueName: \"kubernetes.io/projected/bbe6072d-972f-41be-95d5-aa2726ade6cd-kube-api-access-dnf8b\") pod \"route-controller-manager-5c5bdb7658-hz66d\" (UID: \"bbe6072d-972f-41be-95d5-aa2726ade6cd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.404088 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.414641 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" Mar 12 00:13:43 crc kubenswrapper[4870]: I0312 00:13:43.460595 4870 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 00:13:44 crc kubenswrapper[4870]: I0312 00:13:44.117199 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21602ca-9c73-4523-bb58-e165561d8d43" path="/var/lib/kubelet/pods/a21602ca-9c73-4523-bb58-e165561d8d43/volumes" Mar 12 00:13:44 crc kubenswrapper[4870]: I0312 00:13:44.118478 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21f0998-2c71-4243-91a2-7478cbfeea60" path="/var/lib/kubelet/pods/c21f0998-2c71-4243-91a2-7478cbfeea60/volumes" Mar 12 00:13:44 crc kubenswrapper[4870]: I0312 00:13:44.214781 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 00:13:44 crc kubenswrapper[4870]: I0312 00:13:44.327897 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 00:13:44 crc kubenswrapper[4870]: I0312 00:13:44.690727 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 00:13:45 crc kubenswrapper[4870]: I0312 00:13:45.073677 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 00:13:45 crc kubenswrapper[4870]: I0312 00:13:45.151686 4870 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 00:13:45 crc kubenswrapper[4870]: I0312 00:13:45.152001 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77" gracePeriod=5 Mar 12 00:13:45 crc kubenswrapper[4870]: I0312 00:13:45.511733 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 00:13:45 crc kubenswrapper[4870]: I0312 00:13:45.530747 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 00:13:45 crc kubenswrapper[4870]: I0312 00:13:45.837331 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 00:13:45 crc kubenswrapper[4870]: I0312 00:13:45.869422 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 00:13:46 crc kubenswrapper[4870]: I0312 00:13:46.664583 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 00:13:46 crc kubenswrapper[4870]: I0312 00:13:46.711662 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 12 00:13:46 crc kubenswrapper[4870]: I0312 00:13:46.711737 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.205904 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 
00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.233254 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.530011 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.671907 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.768847 4870 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.782485 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.807498 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.838957 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 00:13:47 crc kubenswrapper[4870]: I0312 00:13:47.917063 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.290954 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.377878 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 
12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.418974 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.420842 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.485536 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.536587 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.605925 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.633355 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.668740 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.677793 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.965204 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 00:13:48 crc kubenswrapper[4870]: I0312 00:13:48.988784 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.026109 4870 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.100585 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.122105 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.161059 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.204438 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.206582 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.276275 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.441569 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.518934 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.645858 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.700175 4870 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.752782 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.811170 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 00:13:49 crc kubenswrapper[4870]: I0312 00:13:49.982895 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.048292 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.161219 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.194205 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.246602 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.291414 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.328788 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.369577 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.433342 
4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.511520 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.587427 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.671068 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.700822 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.717469 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.717550 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.760548 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.760630 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.760807 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.760858 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.760944 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.761536 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: 
"manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.761625 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.761686 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.762365 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.769040 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.807770 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.862651 4870 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.862687 4870 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.862700 4870 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.862711 4870 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.862723 4870 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.901315 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 00:13:50 crc kubenswrapper[4870]: I0312 00:13:50.924474 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.021075 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.021199 4870 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77" exitCode=137 Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.021271 4870 scope.go:117] "RemoveContainer" containerID="3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.021344 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.049283 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.056260 4870 scope.go:117] "RemoveContainer" containerID="3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77" Mar 12 00:13:51 crc kubenswrapper[4870]: E0312 00:13:51.056880 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77\": container with ID starting with 3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77 not found: ID does not exist" containerID="3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.056952 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77"} err="failed to get container status \"3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77\": rpc error: code = NotFound desc = could not 
find container \"3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77\": container with ID starting with 3d3dcfa1de40cdefabb7b39ee36dee30e09ccb3910ad7998ceecd55d17c35d77 not found: ID does not exist" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.107257 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.211648 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.233194 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.240569 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.288849 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.320335 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.415979 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.434079 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.458061 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.459188 4870 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.529197 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.555344 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.559462 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.577247 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.668091 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.678703 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.693261 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.700167 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.783410 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.828345 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.839915 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.849089 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.906104 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 00:13:51 crc kubenswrapper[4870]: I0312 00:13:51.970745 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.005566 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.022593 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.090732 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.094854 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.109476 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.115449 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 
00:13:52.115929 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.116918 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.118008 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.126540 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.126517334 podStartE2EDuration="19.126517334s" podCreationTimestamp="2026-03-12 00:13:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:13:43.14369317 +0000 UTC m=+313.747109490" watchObservedRunningTime="2026-03-12 00:13:52.126517334 +0000 UTC m=+322.729933694" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.128667 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.128707 4870 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0c85efa5-791c-4607-8632-c7c37aef6925" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.135504 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.135588 4870 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0c85efa5-791c-4607-8632-c7c37aef6925" Mar 12 00:13:52 crc 
kubenswrapper[4870]: I0312 00:13:52.184480 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.185995 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.311170 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.353739 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.388917 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.504912 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.514184 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.565279 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.681685 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.693702 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.713889 4870 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.769571 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.776016 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.869531 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.901816 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 00:13:52 crc kubenswrapper[4870]: I0312 00:13:52.907725 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.028119 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.035539 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.040690 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.104100 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.147653 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.179702 
4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.187449 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.256444 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.333743 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.339293 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.401702 4870 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.469994 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.527094 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.582696 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.638543 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.653212 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.699134 
4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.731171 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.757761 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 00:13:53 crc kubenswrapper[4870]: I0312 00:13:53.879808 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.014228 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.053845 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.105919 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.182333 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.191069 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.209052 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.216066 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 
00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.236960 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.257252 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.259502 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.353002 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.369307 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.377687 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.428223 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.496182 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.529455 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.536579 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 00:13:54 crc kubenswrapper[4870]: I0312 00:13:54.762060 4870 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.138395 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.150598 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.248245 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.294683 4870 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.333923 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.356531 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.393675 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.460047 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.473534 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.576235 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" 
Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.653297 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.789765 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.837299 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.887187 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.901792 4870 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 12 00:13:55 crc kubenswrapper[4870]: I0312 00:13:55.973415 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.070459 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.400874 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.454361 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.577082 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.675265 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.680739 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.691412 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.711678 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.711777 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.711875 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.712930 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"1e5b1ea1e4dfcd23b4c14e8517cab342dc9c79668bdad21d06bcbdf68f399c90"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.713237 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://1e5b1ea1e4dfcd23b4c14e8517cab342dc9c79668bdad21d06bcbdf68f399c90" gracePeriod=30
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.725741 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.774207 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.840484 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.851751 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 12 00:13:56 crc kubenswrapper[4870]: I0312 00:13:56.968708 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.054745 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.056383 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.074436 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.092857 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.108909 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.142047 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.181035 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.287778 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.356774 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.368766 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.427466 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.483195 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.506825 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.528601 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.576635 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.582242 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.670230 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.728824 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.769318 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.865634 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 12 00:13:57 crc kubenswrapper[4870]: I0312 00:13:57.884905 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.037062 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.062602 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bb9f86d54-rpv5x"]
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.068871 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d"]
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.082220 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.182510 4870 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-image-registry"/"trusted-ca"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.360616 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.402647 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d"]
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.409483 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.427939 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.460012 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bb9f86d54-rpv5x"]
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.478093 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.591735 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.614474 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.624358 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.683627 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.795866 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.939022 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 12 00:13:58 crc kubenswrapper[4870]: I0312 00:13:58.976068 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.008644 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.041299 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.083090 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" event={"ID":"bbe6072d-972f-41be-95d5-aa2726ade6cd","Type":"ContainerStarted","Data":"7a3c6b43fe20dcb0b6675d35f278a1b98c7ddd86d7232cbd4a12e503fd422fe2"}
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.083184 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" event={"ID":"bbe6072d-972f-41be-95d5-aa2726ade6cd","Type":"ContainerStarted","Data":"e890726feb1eb7c59a7304f7a291e21e33da2978236e2bda716a933e6a31a1cd"}
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.083361 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.085850 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" event={"ID":"3ea6d728-7bd6-45b5-acf1-f433e636df49","Type":"ContainerStarted","Data":"e9af571b9a629aa3143ccbebb91a85798edfabf3e1e9f48963b585be761e6816"}
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.086242 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.086263 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" event={"ID":"3ea6d728-7bd6-45b5-acf1-f433e636df49","Type":"ContainerStarted","Data":"3b6f7cc0f20835f8567ce1b06abb3c3377dbeb8a0c3aec83805413bd7171f550"}
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.095489 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.111576 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" podStartSLOduration=45.111552114 podStartE2EDuration="45.111552114s" podCreationTimestamp="2026-03-12 00:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:13:59.111427611 +0000 UTC m=+329.714843961" watchObservedRunningTime="2026-03-12 00:13:59.111552114 +0000 UTC m=+329.714968464"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.135060 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bb9f86d54-rpv5x" podStartSLOduration=45.135038487 podStartE2EDuration="45.135038487s" podCreationTimestamp="2026-03-12 00:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:13:59.133726361 +0000 UTC m=+329.737142711" watchObservedRunningTime="2026-03-12 00:13:59.135038487 +0000 UTC m=+329.738454807"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.183715 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.203335 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.225896 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.564519 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.712010 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.842583 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.868542 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 12 00:13:59 crc kubenswrapper[4870]: I0312 00:13:59.909949 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 12 00:14:00 crc kubenswrapper[4870]: I0312 00:14:00.083376 4870 patch_prober.go:28] interesting pod/route-controller-manager-5c5bdb7658-hz66d container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 12 00:14:00 crc kubenswrapper[4870]: I0312 00:14:00.083473 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" podUID="bbe6072d-972f-41be-95d5-aa2726ade6cd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 00:14:00 crc kubenswrapper[4870]: I0312 00:14:00.107282 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 12 00:14:00 crc kubenswrapper[4870]: I0312 00:14:00.229063 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 12 00:14:00 crc kubenswrapper[4870]: I0312 00:14:00.504392 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 12 00:14:00 crc kubenswrapper[4870]: I0312 00:14:00.930759 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.007453 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.044419 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.093733 4870 patch_prober.go:28] interesting pod/route-controller-manager-5c5bdb7658-hz66d container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
start-of-body=
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.093800 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d" podUID="bbe6072d-972f-41be-95d5-aa2726ade6cd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.238639 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.243950 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.289320 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.426814 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.709359 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 00:14:01 crc kubenswrapper[4870]: I0312 00:14:01.972413 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 12 00:14:02 crc kubenswrapper[4870]: I0312 00:14:02.005446 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 12 00:14:02 crc kubenswrapper[4870]: I0312 00:14:02.367333 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 12 00:14:02 crc kubenswrapper[4870]: I0312 00:14:02.921326 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 12 00:14:04 crc kubenswrapper[4870]: I0312 00:14:04.271293 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c5bdb7658-hz66d"
Mar 12 00:14:04 crc kubenswrapper[4870]: I0312 00:14:04.418973 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 12 00:14:26 crc kubenswrapper[4870]: I0312 00:14:26.262617 4870 generic.go:334] "Generic (PLEG): container finished" podID="596347fa-d520-46af-b25c-860d7c0d91a4" containerID="d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e" exitCode=0
Mar 12 00:14:26 crc kubenswrapper[4870]: I0312 00:14:26.262745 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" event={"ID":"596347fa-d520-46af-b25c-860d7c0d91a4","Type":"ContainerDied","Data":"d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e"}
Mar 12 00:14:26 crc kubenswrapper[4870]: I0312 00:14:26.263738 4870 scope.go:117] "RemoveContainer" containerID="d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.272733 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" event={"ID":"596347fa-d520-46af-b25c-860d7c0d91a4","Type":"ContainerStarted","Data":"a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b"}
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.273793 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.275566 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.277067 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.280264 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.283393 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.283467 4870 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1e5b1ea1e4dfcd23b4c14e8517cab342dc9c79668bdad21d06bcbdf68f399c90" exitCode=137
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.283513 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1e5b1ea1e4dfcd23b4c14e8517cab342dc9c79668bdad21d06bcbdf68f399c90"}
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.283555 4870 scope.go:117] "RemoveContainer" containerID="5e91132a8abb1c48422a3f3cb1f610e6f8115a0068d9a054daa31787583e4e38"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.519404 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m78hv"]
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.520094 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m78hv" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="registry-server" containerID="cri-o://2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2" gracePeriod=30
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.531101 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qfz5g"]
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.531604 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qfz5g" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="registry-server" containerID="cri-o://8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab" gracePeriod=30
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.545662 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6znk2"]
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.561835 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt622"]
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.562134 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qt622" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerName="registry-server" containerID="cri-o://e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" gracePeriod=30
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.571447 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-78vzj"]
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.571672 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-78vzj" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="registry-server" containerID="cri-o://458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3" gracePeriod=30
Mar 12 00:14:27 crc kubenswrapper[4870]: E0312 00:14:27.779942 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3 is running failed: container process not found" containerID="e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" cmd=["grpc_health_probe","-addr=:50051"]
Mar 12 00:14:27 crc kubenswrapper[4870]: E0312 00:14:27.780354 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3 is running failed: container process not found" containerID="e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" cmd=["grpc_health_probe","-addr=:50051"]
Mar 12 00:14:27 crc kubenswrapper[4870]: E0312 00:14:27.780571 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3 is running failed: container process not found" containerID="e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" cmd=["grpc_health_probe","-addr=:50051"]
Mar 12 00:14:27 crc kubenswrapper[4870]: E0312 00:14:27.780599 4870 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-qt622" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerName="registry-server"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.967627 4870 util.go:48] "No
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qfz5g"
Mar 12 00:14:27 crc kubenswrapper[4870]: I0312 00:14:27.971782 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m78hv"
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.006466 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt622"
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.018004 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-78vzj"
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064573 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk9lr\" (UniqueName: \"kubernetes.io/projected/633cb50d-ccf5-4e3c-a40f-05581c94950e-kube-api-access-sk9lr\") pod \"633cb50d-ccf5-4e3c-a40f-05581c94950e\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064611 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w975\" (UniqueName: \"kubernetes.io/projected/5c8b915a-17ad-4b09-812f-dea6471a117c-kube-api-access-7w975\") pod \"5c8b915a-17ad-4b09-812f-dea6471a117c\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064644 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-catalog-content\") pod \"61a02593-b52d-470c-967d-565b6fafde45\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064670 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-catalog-content\") pod \"633cb50d-ccf5-4e3c-a40f-05581c94950e\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064685 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-catalog-content\") pod \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064698 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8sk\" (UniqueName: \"kubernetes.io/projected/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-kube-api-access-zh8sk\") pod \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064730 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5v97\" (UniqueName: \"kubernetes.io/projected/61a02593-b52d-470c-967d-565b6fafde45-kube-api-access-q5v97\") pod \"61a02593-b52d-470c-967d-565b6fafde45\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064748 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-catalog-content\") pod \"5c8b915a-17ad-4b09-812f-dea6471a117c\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064765 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-utilities\") pod \"633cb50d-ccf5-4e3c-a40f-05581c94950e\" (UID: \"633cb50d-ccf5-4e3c-a40f-05581c94950e\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064786 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-utilities\") pod \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\" (UID: \"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064807 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-utilities\") pod \"5c8b915a-17ad-4b09-812f-dea6471a117c\" (UID: \"5c8b915a-17ad-4b09-812f-dea6471a117c\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.064855 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-utilities\") pod \"61a02593-b52d-470c-967d-565b6fafde45\" (UID: \"61a02593-b52d-470c-967d-565b6fafde45\") "
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.065834 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-utilities" (OuterVolumeSpecName: "utilities") pod "d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" (UID: "d4b05c20-2025-4ce8-9c10-a31f3e0b20e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.066420 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-utilities" (OuterVolumeSpecName: "utilities") pod "633cb50d-ccf5-4e3c-a40f-05581c94950e" (UID: "633cb50d-ccf5-4e3c-a40f-05581c94950e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.066507 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-utilities" (OuterVolumeSpecName: "utilities") pod "5c8b915a-17ad-4b09-812f-dea6471a117c" (UID: "5c8b915a-17ad-4b09-812f-dea6471a117c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.066854 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-utilities" (OuterVolumeSpecName: "utilities") pod "61a02593-b52d-470c-967d-565b6fafde45" (UID: "61a02593-b52d-470c-967d-565b6fafde45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.069989 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/633cb50d-ccf5-4e3c-a40f-05581c94950e-kube-api-access-sk9lr" (OuterVolumeSpecName: "kube-api-access-sk9lr") pod "633cb50d-ccf5-4e3c-a40f-05581c94950e" (UID: "633cb50d-ccf5-4e3c-a40f-05581c94950e"). InnerVolumeSpecName "kube-api-access-sk9lr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.071189 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-kube-api-access-zh8sk" (OuterVolumeSpecName: "kube-api-access-zh8sk") pod "d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" (UID: "d4b05c20-2025-4ce8-9c10-a31f3e0b20e1"). InnerVolumeSpecName "kube-api-access-zh8sk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.071453 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a02593-b52d-470c-967d-565b6fafde45-kube-api-access-q5v97" (OuterVolumeSpecName: "kube-api-access-q5v97") pod "61a02593-b52d-470c-967d-565b6fafde45" (UID: "61a02593-b52d-470c-967d-565b6fafde45"). InnerVolumeSpecName "kube-api-access-q5v97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.085417 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8b915a-17ad-4b09-812f-dea6471a117c-kube-api-access-7w975" (OuterVolumeSpecName: "kube-api-access-7w975") pod "5c8b915a-17ad-4b09-812f-dea6471a117c" (UID: "5c8b915a-17ad-4b09-812f-dea6471a117c"). InnerVolumeSpecName "kube-api-access-7w975". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.105034 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c8b915a-17ad-4b09-812f-dea6471a117c" (UID: "5c8b915a-17ad-4b09-812f-dea6471a117c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.136220 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61a02593-b52d-470c-967d-565b6fafde45" (UID: "61a02593-b52d-470c-967d-565b6fafde45"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.158786 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "633cb50d-ccf5-4e3c-a40f-05581c94950e" (UID: "633cb50d-ccf5-4e3c-a40f-05581c94950e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165464 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165494 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165506 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh8sk\" (UniqueName: \"kubernetes.io/projected/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-kube-api-access-zh8sk\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165523 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5v97\" (UniqueName: \"kubernetes.io/projected/61a02593-b52d-470c-967d-565b6fafde45-kube-api-access-q5v97\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165535 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165549 4870 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/633cb50d-ccf5-4e3c-a40f-05581c94950e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165560 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165571 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8b915a-17ad-4b09-812f-dea6471a117c-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165582 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a02593-b52d-470c-967d-565b6fafde45-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165593 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk9lr\" (UniqueName: \"kubernetes.io/projected/633cb50d-ccf5-4e3c-a40f-05581c94950e-kube-api-access-sk9lr\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.165605 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w975\" (UniqueName: \"kubernetes.io/projected/5c8b915a-17ad-4b09-812f-dea6471a117c-kube-api-access-7w975\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.201637 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" (UID: "d4b05c20-2025-4ce8-9c10-a31f3e0b20e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.267445 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.291827 4870 generic.go:334] "Generic (PLEG): container finished" podID="61a02593-b52d-470c-967d-565b6fafde45" containerID="2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2" exitCode=0 Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.291904 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m78hv" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.291919 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m78hv" event={"ID":"61a02593-b52d-470c-967d-565b6fafde45","Type":"ContainerDied","Data":"2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.292014 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m78hv" event={"ID":"61a02593-b52d-470c-967d-565b6fafde45","Type":"ContainerDied","Data":"636dd20bce29b2c8c2eaaf6858e2b23a112ccdd774e72cdb594503c057c5f8bf"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.292060 4870 scope.go:117] "RemoveContainer" containerID="2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.297942 4870 generic.go:334] "Generic (PLEG): container finished" podID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerID="458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3" exitCode=0 Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.298115 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-78vzj" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.298134 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-78vzj" event={"ID":"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1","Type":"ContainerDied","Data":"458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.298339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-78vzj" event={"ID":"d4b05c20-2025-4ce8-9c10-a31f3e0b20e1","Type":"ContainerDied","Data":"0c5688b12997332a6343c2f23ca5a1c798282eb9b6b5355bdd6a0c3c17e24a60"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.300740 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.302427 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.302484 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c14a7055c5fbc090813252a62db7f5d4c11c551cc5f6a9fc6392abf478705b2e"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.307415 4870 generic.go:334] "Generic (PLEG): container finished" podID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerID="8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab" exitCode=0 Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.307461 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qfz5g" 
event={"ID":"633cb50d-ccf5-4e3c-a40f-05581c94950e","Type":"ContainerDied","Data":"8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.307476 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qfz5g" event={"ID":"633cb50d-ccf5-4e3c-a40f-05581c94950e","Type":"ContainerDied","Data":"6ef678e27eb1eaa09e4440b9184ccaedf27c60553db3cce52876ec2a4191793f"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.307529 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qfz5g" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.311287 4870 scope.go:117] "RemoveContainer" containerID="8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.314016 4870 generic.go:334] "Generic (PLEG): container finished" podID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerID="e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" exitCode=0 Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.314678 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt622" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.314970 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt622" event={"ID":"5c8b915a-17ad-4b09-812f-dea6471a117c","Type":"ContainerDied","Data":"e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.315006 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt622" event={"ID":"5c8b915a-17ad-4b09-812f-dea6471a117c","Type":"ContainerDied","Data":"58664ed2efb651d27408c09695644532b06548244f67df79c1c84099cf49ddf3"} Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.335257 4870 scope.go:117] "RemoveContainer" containerID="8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.347501 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-78vzj"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.356342 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-78vzj"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.360428 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m78hv"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.362242 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m78hv"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.376995 4870 scope.go:117] "RemoveContainer" containerID="2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.377530 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2\": container with ID starting with 2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2 not found: ID does not exist" containerID="2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.377584 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2"} err="failed to get container status \"2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2\": rpc error: code = NotFound desc = could not find container \"2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2\": container with ID starting with 2d68e9cc23b339b674622f2d0be8c1ed43e7e8f6f0d05550b4495be80e4b11c2 not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.377621 4870 scope.go:117] "RemoveContainer" containerID="8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.377901 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce\": container with ID starting with 8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce not found: ID does not exist" containerID="8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.377923 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce"} err="failed to get container status \"8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce\": rpc error: code = NotFound desc = could not find container \"8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce\": container with ID 
starting with 8b46c05f4c6df951978a2d9b046d32e7c7aee2ae8600c9a6b5012998a21c39ce not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.377937 4870 scope.go:117] "RemoveContainer" containerID="8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.378095 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6\": container with ID starting with 8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6 not found: ID does not exist" containerID="8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.378119 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6"} err="failed to get container status \"8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6\": rpc error: code = NotFound desc = could not find container \"8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6\": container with ID starting with 8f6e8aae1641034d52560a1e2296a2048270722178bba6e072320250e06e12b6 not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.378134 4870 scope.go:117] "RemoveContainer" containerID="458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.381269 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt622"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.388158 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt622"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.391194 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-qfz5g"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.393850 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qfz5g"] Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.397617 4870 scope.go:117] "RemoveContainer" containerID="5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.410646 4870 scope.go:117] "RemoveContainer" containerID="4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.427505 4870 scope.go:117] "RemoveContainer" containerID="458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.427838 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3\": container with ID starting with 458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3 not found: ID does not exist" containerID="458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.427875 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3"} err="failed to get container status \"458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3\": rpc error: code = NotFound desc = could not find container \"458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3\": container with ID starting with 458a7551760263a83864be5af698518b578c43ead4ba9d83b5d546c4f501c6c3 not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.427900 4870 scope.go:117] "RemoveContainer" 
containerID="5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.428202 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8\": container with ID starting with 5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8 not found: ID does not exist" containerID="5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.428225 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8"} err="failed to get container status \"5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8\": rpc error: code = NotFound desc = could not find container \"5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8\": container with ID starting with 5d918e7b8b0c06a3e595bfa517bf627e22ce8337012ff1e7713b94cf710242a8 not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.428241 4870 scope.go:117] "RemoveContainer" containerID="4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.428551 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195\": container with ID starting with 4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195 not found: ID does not exist" containerID="4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.428597 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195"} err="failed to get container status \"4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195\": rpc error: code = NotFound desc = could not find container \"4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195\": container with ID starting with 4ab239aa8ddb1cdaf98bbf353d6580864293297096d6355ddb809f2d99b27195 not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.428615 4870 scope.go:117] "RemoveContainer" containerID="8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.442258 4870 scope.go:117] "RemoveContainer" containerID="e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.457489 4870 scope.go:117] "RemoveContainer" containerID="1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.471499 4870 scope.go:117] "RemoveContainer" containerID="8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.471888 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab\": container with ID starting with 8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab not found: ID does not exist" containerID="8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.471946 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab"} err="failed to get container status \"8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab\": rpc error: code = 
NotFound desc = could not find container \"8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab\": container with ID starting with 8307591c6d79f7a6fa9b2d10d72c1c6ca27a40fcff618a2f57d09e591c6a4cab not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.471970 4870 scope.go:117] "RemoveContainer" containerID="e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.472233 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1\": container with ID starting with e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1 not found: ID does not exist" containerID="e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.472253 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1"} err="failed to get container status \"e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1\": rpc error: code = NotFound desc = could not find container \"e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1\": container with ID starting with e60ed4d50cfeb12f2cb87635ffe08c05b391dcd73bdf2d33e966696dfb20ffb1 not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.472269 4870 scope.go:117] "RemoveContainer" containerID="1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.472703 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c\": container with ID starting with 
1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c not found: ID does not exist" containerID="1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.472725 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c"} err="failed to get container status \"1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c\": rpc error: code = NotFound desc = could not find container \"1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c\": container with ID starting with 1129c1dcd46a2f49ec33c6dae9d5ecb15df03cfa3a4430d7d08f5177743c723c not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.472741 4870 scope.go:117] "RemoveContainer" containerID="e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.484625 4870 scope.go:117] "RemoveContainer" containerID="ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.496801 4870 scope.go:117] "RemoveContainer" containerID="8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.508503 4870 scope.go:117] "RemoveContainer" containerID="e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.508813 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3\": container with ID starting with e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3 not found: ID does not exist" containerID="e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 
00:14:28.508845 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3"} err="failed to get container status \"e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3\": rpc error: code = NotFound desc = could not find container \"e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3\": container with ID starting with e8c60e70a5239fe339b506e023f58429c3010b8f5548175b0007dfda644e66a3 not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.508873 4870 scope.go:117] "RemoveContainer" containerID="ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c" Mar 12 00:14:28 crc kubenswrapper[4870]: E0312 00:14:28.509067 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c\": container with ID starting with ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c not found: ID does not exist" containerID="ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.509090 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c"} err="failed to get container status \"ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c\": rpc error: code = NotFound desc = could not find container \"ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c\": container with ID starting with ddb4668c99e23cdee8b4fe2f82795e75c79ef384045cbb323351f749f86c520c not found: ID does not exist" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.509107 4870 scope.go:117] "RemoveContainer" containerID="8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8" Mar 12 00:14:28 crc 
kubenswrapper[4870]: E0312 00:14:28.509311 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8\": container with ID starting with 8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8 not found: ID does not exist" containerID="8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8" Mar 12 00:14:28 crc kubenswrapper[4870]: I0312 00:14:28.509335 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8"} err="failed to get container status \"8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8\": rpc error: code = NotFound desc = could not find container \"8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8\": container with ID starting with 8ff6a02de33b42efe5383e82c9da000835d82b3c999b9534f703a2baa4ca89a8 not found: ID does not exist" Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.324909 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" containerID="cri-o://a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b" gracePeriod=30 Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.682351 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.786865 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-trusted-ca\") pod \"596347fa-d520-46af-b25c-860d7c0d91a4\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.786911 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-operator-metrics\") pod \"596347fa-d520-46af-b25c-860d7c0d91a4\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.786940 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zmpm\" (UniqueName: \"kubernetes.io/projected/596347fa-d520-46af-b25c-860d7c0d91a4-kube-api-access-7zmpm\") pod \"596347fa-d520-46af-b25c-860d7c0d91a4\" (UID: \"596347fa-d520-46af-b25c-860d7c0d91a4\") " Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.787426 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "596347fa-d520-46af-b25c-860d7c0d91a4" (UID: "596347fa-d520-46af-b25c-860d7c0d91a4"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.792024 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596347fa-d520-46af-b25c-860d7c0d91a4-kube-api-access-7zmpm" (OuterVolumeSpecName: "kube-api-access-7zmpm") pod "596347fa-d520-46af-b25c-860d7c0d91a4" (UID: "596347fa-d520-46af-b25c-860d7c0d91a4"). InnerVolumeSpecName "kube-api-access-7zmpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.792446 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "596347fa-d520-46af-b25c-860d7c0d91a4" (UID: "596347fa-d520-46af-b25c-860d7c0d91a4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.890955 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.891002 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/596347fa-d520-46af-b25c-860d7c0d91a4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:29 crc kubenswrapper[4870]: I0312 00:14:29.891017 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zmpm\" (UniqueName: \"kubernetes.io/projected/596347fa-d520-46af-b25c-860d7c0d91a4-kube-api-access-7zmpm\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.126317 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" path="/var/lib/kubelet/pods/5c8b915a-17ad-4b09-812f-dea6471a117c/volumes" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.127518 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a02593-b52d-470c-967d-565b6fafde45" path="/var/lib/kubelet/pods/61a02593-b52d-470c-967d-565b6fafde45/volumes" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.128667 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" path="/var/lib/kubelet/pods/633cb50d-ccf5-4e3c-a40f-05581c94950e/volumes" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.130204 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" path="/var/lib/kubelet/pods/d4b05c20-2025-4ce8-9c10-a31f3e0b20e1/volumes" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.331908 4870 generic.go:334] "Generic (PLEG): container finished" podID="596347fa-d520-46af-b25c-860d7c0d91a4" containerID="a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b" exitCode=0 Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.331953 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" event={"ID":"596347fa-d520-46af-b25c-860d7c0d91a4","Type":"ContainerDied","Data":"a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b"} Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.331979 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" event={"ID":"596347fa-d520-46af-b25c-860d7c0d91a4","Type":"ContainerDied","Data":"5185e8f1811f0dc86275c0f9b8c957077869b48365fc53a5f642dc87f7109f46"} Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.331996 4870 scope.go:117] "RemoveContainer" containerID="a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b" Mar 12 00:14:30 crc 
kubenswrapper[4870]: I0312 00:14:30.332096 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6znk2" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.361830 4870 scope.go:117] "RemoveContainer" containerID="d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.363480 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6znk2"] Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.368181 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6znk2"] Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.386226 4870 scope.go:117] "RemoveContainer" containerID="a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b" Mar 12 00:14:30 crc kubenswrapper[4870]: E0312 00:14:30.387478 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b\": container with ID starting with a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b not found: ID does not exist" containerID="a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.387521 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b"} err="failed to get container status \"a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b\": rpc error: code = NotFound desc = could not find container \"a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b\": container with ID starting with a743645361ccc0f91c899b6cd86294f8dbf9975b3244597a32d1302eabf0d55b not found: ID does not exist" Mar 12 00:14:30 crc 
kubenswrapper[4870]: I0312 00:14:30.387551 4870 scope.go:117] "RemoveContainer" containerID="d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e" Mar 12 00:14:30 crc kubenswrapper[4870]: E0312 00:14:30.387872 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e\": container with ID starting with d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e not found: ID does not exist" containerID="d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e" Mar 12 00:14:30 crc kubenswrapper[4870]: I0312 00:14:30.387888 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e"} err="failed to get container status \"d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e\": rpc error: code = NotFound desc = could not find container \"d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e\": container with ID starting with d346106ac7c53440f9322476125d101256ab839582fc294e9c32ecbcf5f5d13e not found: ID does not exist" Mar 12 00:14:32 crc kubenswrapper[4870]: I0312 00:14:32.119507 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" path="/var/lib/kubelet/pods/596347fa-d520-46af-b25c-860d7c0d91a4/volumes" Mar 12 00:14:34 crc kubenswrapper[4870]: I0312 00:14:34.542881 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:14:36 crc kubenswrapper[4870]: I0312 00:14:36.711201 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:14:36 crc kubenswrapper[4870]: I0312 00:14:36.715435 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:14:37 crc kubenswrapper[4870]: I0312 00:14:37.382745 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 00:14:47 crc kubenswrapper[4870]: I0312 00:14:47.594682 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:14:47 crc kubenswrapper[4870]: I0312 00:14:47.595264 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349378 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-72wh5"] Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349879 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349890 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349910 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349916 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" 
containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349926 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349932 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349939 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349945 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349953 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349959 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349968 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349973 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349980 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.349985 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" 
containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.349994 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350001 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.350009 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350015 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.350021 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350027 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.350034 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350040 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.350047 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350052 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="extract-content" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.350062 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350068 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="extract-utilities" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.350078 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350083 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 00:14:48 crc kubenswrapper[4870]: E0312 00:14:48.350091 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350097 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350194 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350205 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="596347fa-d520-46af-b25c-860d7c0d91a4" containerName="marketplace-operator" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350214 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="633cb50d-ccf5-4e3c-a40f-05581c94950e" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350222 4870 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350232 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b05c20-2025-4ce8-9c10-a31f3e0b20e1" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350238 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a02593-b52d-470c-967d-565b6fafde45" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350247 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8b915a-17ad-4b09-812f-dea6471a117c" containerName="registry-server" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.350597 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.356740 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554574-m2sqq"] Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.358083 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554574-m2sqq" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.367328 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.367425 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.367683 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.367820 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.367972 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.369490 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.383504 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554574-m2sqq"] Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.385338 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.386235 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.402292 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-72wh5"] Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.449759 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0f4d065-12c4-4d6c-aa9e-56560911ed54-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.449823 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gpcw\" (UniqueName: \"kubernetes.io/projected/a0f4d065-12c4-4d6c-aa9e-56560911ed54-kube-api-access-2gpcw\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.449900 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxjhs\" (UniqueName: \"kubernetes.io/projected/8f600df0-7365-49d2-ba00-9747953def68-kube-api-access-kxjhs\") pod \"auto-csr-approver-29554574-m2sqq\" (UID: \"8f600df0-7365-49d2-ba00-9747953def68\") " pod="openshift-infra/auto-csr-approver-29554574-m2sqq" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.449918 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f4d065-12c4-4d6c-aa9e-56560911ed54-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.551489 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gpcw\" (UniqueName: \"kubernetes.io/projected/a0f4d065-12c4-4d6c-aa9e-56560911ed54-kube-api-access-2gpcw\") pod 
\"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.551576 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxjhs\" (UniqueName: \"kubernetes.io/projected/8f600df0-7365-49d2-ba00-9747953def68-kube-api-access-kxjhs\") pod \"auto-csr-approver-29554574-m2sqq\" (UID: \"8f600df0-7365-49d2-ba00-9747953def68\") " pod="openshift-infra/auto-csr-approver-29554574-m2sqq" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.551594 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f4d065-12c4-4d6c-aa9e-56560911ed54-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.551626 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0f4d065-12c4-4d6c-aa9e-56560911ed54-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.552804 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0f4d065-12c4-4d6c-aa9e-56560911ed54-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.558824 4870 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0f4d065-12c4-4d6c-aa9e-56560911ed54-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.578751 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gpcw\" (UniqueName: \"kubernetes.io/projected/a0f4d065-12c4-4d6c-aa9e-56560911ed54-kube-api-access-2gpcw\") pod \"marketplace-operator-79b997595-72wh5\" (UID: \"a0f4d065-12c4-4d6c-aa9e-56560911ed54\") " pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.585171 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxjhs\" (UniqueName: \"kubernetes.io/projected/8f600df0-7365-49d2-ba00-9747953def68-kube-api-access-kxjhs\") pod \"auto-csr-approver-29554574-m2sqq\" (UID: \"8f600df0-7365-49d2-ba00-9747953def68\") " pod="openshift-infra/auto-csr-approver-29554574-m2sqq" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.691489 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:48 crc kubenswrapper[4870]: I0312 00:14:48.705920 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554574-m2sqq" Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.145790 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-72wh5"] Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.196642 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554574-m2sqq"] Mar 12 00:14:49 crc kubenswrapper[4870]: W0312 00:14:49.203533 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f600df0_7365_49d2_ba00_9747953def68.slice/crio-51c6b8cf2d9c219eb3bd0b83b4db1a33367879f0ca51ac55f734e384832c30c9 WatchSource:0}: Error finding container 51c6b8cf2d9c219eb3bd0b83b4db1a33367879f0ca51ac55f734e384832c30c9: Status 404 returned error can't find the container with id 51c6b8cf2d9c219eb3bd0b83b4db1a33367879f0ca51ac55f734e384832c30c9 Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.458489 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554574-m2sqq" event={"ID":"8f600df0-7365-49d2-ba00-9747953def68","Type":"ContainerStarted","Data":"51c6b8cf2d9c219eb3bd0b83b4db1a33367879f0ca51ac55f734e384832c30c9"} Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.460705 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" event={"ID":"a0f4d065-12c4-4d6c-aa9e-56560911ed54","Type":"ContainerStarted","Data":"7edc6e0ec1b44ebd9333cddd95f7cebc03a10c626ce5ce7a2f54610390f8d6db"} Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.460751 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" event={"ID":"a0f4d065-12c4-4d6c-aa9e-56560911ed54","Type":"ContainerStarted","Data":"644f11a7529ca1d67d6c9772719deee5aab2a4a8304ac5340e2b58f33e702b67"} Mar 12 00:14:49 crc 
kubenswrapper[4870]: I0312 00:14:49.461034 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.462838 4870 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-72wh5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.462920 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" podUID="a0f4d065-12c4-4d6c-aa9e-56560911ed54" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 12 00:14:49 crc kubenswrapper[4870]: I0312 00:14:49.480610 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" podStartSLOduration=1.480590413 podStartE2EDuration="1.480590413s" podCreationTimestamp="2026-03-12 00:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:14:49.479089431 +0000 UTC m=+380.082505731" watchObservedRunningTime="2026-03-12 00:14:49.480590413 +0000 UTC m=+380.084006723" Mar 12 00:14:50 crc kubenswrapper[4870]: I0312 00:14:50.470846 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-72wh5" Mar 12 00:14:51 crc kubenswrapper[4870]: I0312 00:14:51.476368 4870 generic.go:334] "Generic (PLEG): container finished" podID="8f600df0-7365-49d2-ba00-9747953def68" containerID="f75292cc45e88bcd878dab5d9e8ab5a87bd7913266e53492c95c49e82186f0cd" exitCode=0 Mar 12 00:14:51 crc 
kubenswrapper[4870]: I0312 00:14:51.476450 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554574-m2sqq" event={"ID":"8f600df0-7365-49d2-ba00-9747953def68","Type":"ContainerDied","Data":"f75292cc45e88bcd878dab5d9e8ab5a87bd7913266e53492c95c49e82186f0cd"} Mar 12 00:14:52 crc kubenswrapper[4870]: I0312 00:14:52.890738 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554574-m2sqq" Mar 12 00:14:53 crc kubenswrapper[4870]: I0312 00:14:53.014325 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxjhs\" (UniqueName: \"kubernetes.io/projected/8f600df0-7365-49d2-ba00-9747953def68-kube-api-access-kxjhs\") pod \"8f600df0-7365-49d2-ba00-9747953def68\" (UID: \"8f600df0-7365-49d2-ba00-9747953def68\") " Mar 12 00:14:53 crc kubenswrapper[4870]: I0312 00:14:53.023332 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f600df0-7365-49d2-ba00-9747953def68-kube-api-access-kxjhs" (OuterVolumeSpecName: "kube-api-access-kxjhs") pod "8f600df0-7365-49d2-ba00-9747953def68" (UID: "8f600df0-7365-49d2-ba00-9747953def68"). InnerVolumeSpecName "kube-api-access-kxjhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:14:53 crc kubenswrapper[4870]: I0312 00:14:53.116253 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxjhs\" (UniqueName: \"kubernetes.io/projected/8f600df0-7365-49d2-ba00-9747953def68-kube-api-access-kxjhs\") on node \"crc\" DevicePath \"\"" Mar 12 00:14:53 crc kubenswrapper[4870]: I0312 00:14:53.492010 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554574-m2sqq" event={"ID":"8f600df0-7365-49d2-ba00-9747953def68","Type":"ContainerDied","Data":"51c6b8cf2d9c219eb3bd0b83b4db1a33367879f0ca51ac55f734e384832c30c9"} Mar 12 00:14:53 crc kubenswrapper[4870]: I0312 00:14:53.492426 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c6b8cf2d9c219eb3bd0b83b4db1a33367879f0ca51ac55f734e384832c30c9" Mar 12 00:14:53 crc kubenswrapper[4870]: I0312 00:14:53.492126 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554574-m2sqq" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.157491 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n"] Mar 12 00:15:00 crc kubenswrapper[4870]: E0312 00:15:00.158062 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f600df0-7365-49d2-ba00-9747953def68" containerName="oc" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.158077 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f600df0-7365-49d2-ba00-9747953def68" containerName="oc" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.158214 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f600df0-7365-49d2-ba00-9747953def68" containerName="oc" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.158640 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.161326 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.161927 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.172390 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n"] Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.236410 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9aef54f-62b6-435d-b94f-a0468e146e9e-secret-volume\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.236493 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdfjg\" (UniqueName: \"kubernetes.io/projected/a9aef54f-62b6-435d-b94f-a0468e146e9e-kube-api-access-xdfjg\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.236551 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9aef54f-62b6-435d-b94f-a0468e146e9e-config-volume\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.337016 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9aef54f-62b6-435d-b94f-a0468e146e9e-config-volume\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.337081 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9aef54f-62b6-435d-b94f-a0468e146e9e-secret-volume\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.337114 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdfjg\" (UniqueName: \"kubernetes.io/projected/a9aef54f-62b6-435d-b94f-a0468e146e9e-kube-api-access-xdfjg\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.338199 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9aef54f-62b6-435d-b94f-a0468e146e9e-config-volume\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.356005 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a9aef54f-62b6-435d-b94f-a0468e146e9e-secret-volume\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.356546 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdfjg\" (UniqueName: \"kubernetes.io/projected/a9aef54f-62b6-435d-b94f-a0468e146e9e-kube-api-access-xdfjg\") pod \"collect-profiles-29554575-r2v9n\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.484638 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:00 crc kubenswrapper[4870]: I0312 00:15:00.904705 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n"] Mar 12 00:15:00 crc kubenswrapper[4870]: W0312 00:15:00.911551 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9aef54f_62b6_435d_b94f_a0468e146e9e.slice/crio-0601ececc76dea67eebe967818f14ed9fc8727e35d390c6d64db77aafd7ccc8f WatchSource:0}: Error finding container 0601ececc76dea67eebe967818f14ed9fc8727e35d390c6d64db77aafd7ccc8f: Status 404 returned error can't find the container with id 0601ececc76dea67eebe967818f14ed9fc8727e35d390c6d64db77aafd7ccc8f Mar 12 00:15:01 crc kubenswrapper[4870]: I0312 00:15:01.550729 4870 generic.go:334] "Generic (PLEG): container finished" podID="a9aef54f-62b6-435d-b94f-a0468e146e9e" containerID="733fc6ee360cadea3373add23af20edf27a8a335e4336979e02f942aa3e9f810" exitCode=0 Mar 12 00:15:01 crc kubenswrapper[4870]: I0312 00:15:01.550795 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" event={"ID":"a9aef54f-62b6-435d-b94f-a0468e146e9e","Type":"ContainerDied","Data":"733fc6ee360cadea3373add23af20edf27a8a335e4336979e02f942aa3e9f810"} Mar 12 00:15:01 crc kubenswrapper[4870]: I0312 00:15:01.551040 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" event={"ID":"a9aef54f-62b6-435d-b94f-a0468e146e9e","Type":"ContainerStarted","Data":"0601ececc76dea67eebe967818f14ed9fc8727e35d390c6d64db77aafd7ccc8f"} Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:02.852618 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:02.972298 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdfjg\" (UniqueName: \"kubernetes.io/projected/a9aef54f-62b6-435d-b94f-a0468e146e9e-kube-api-access-xdfjg\") pod \"a9aef54f-62b6-435d-b94f-a0468e146e9e\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:02.972357 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9aef54f-62b6-435d-b94f-a0468e146e9e-config-volume\") pod \"a9aef54f-62b6-435d-b94f-a0468e146e9e\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:02.972399 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9aef54f-62b6-435d-b94f-a0468e146e9e-secret-volume\") pod \"a9aef54f-62b6-435d-b94f-a0468e146e9e\" (UID: \"a9aef54f-62b6-435d-b94f-a0468e146e9e\") " Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:02.973160 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a9aef54f-62b6-435d-b94f-a0468e146e9e-config-volume" (OuterVolumeSpecName: "config-volume") pod "a9aef54f-62b6-435d-b94f-a0468e146e9e" (UID: "a9aef54f-62b6-435d-b94f-a0468e146e9e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:02.977859 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9aef54f-62b6-435d-b94f-a0468e146e9e-kube-api-access-xdfjg" (OuterVolumeSpecName: "kube-api-access-xdfjg") pod "a9aef54f-62b6-435d-b94f-a0468e146e9e" (UID: "a9aef54f-62b6-435d-b94f-a0468e146e9e"). InnerVolumeSpecName "kube-api-access-xdfjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:02.978062 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9aef54f-62b6-435d-b94f-a0468e146e9e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a9aef54f-62b6-435d-b94f-a0468e146e9e" (UID: "a9aef54f-62b6-435d-b94f-a0468e146e9e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:03.073229 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdfjg\" (UniqueName: \"kubernetes.io/projected/a9aef54f-62b6-435d-b94f-a0468e146e9e-kube-api-access-xdfjg\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:03.073270 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9aef54f-62b6-435d-b94f-a0468e146e9e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:03.073280 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9aef54f-62b6-435d-b94f-a0468e146e9e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:03.563329 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" event={"ID":"a9aef54f-62b6-435d-b94f-a0468e146e9e","Type":"ContainerDied","Data":"0601ececc76dea67eebe967818f14ed9fc8727e35d390c6d64db77aafd7ccc8f"} Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:03.563583 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0601ececc76dea67eebe967818f14ed9fc8727e35d390c6d64db77aafd7ccc8f" Mar 12 00:15:03 crc kubenswrapper[4870]: I0312 00:15:03.563398 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554575-r2v9n" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.278890 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lj5nj"] Mar 12 00:15:09 crc kubenswrapper[4870]: E0312 00:15:09.279667 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9aef54f-62b6-435d-b94f-a0468e146e9e" containerName="collect-profiles" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.279681 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9aef54f-62b6-435d-b94f-a0468e146e9e" containerName="collect-profiles" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.279827 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9aef54f-62b6-435d-b94f-a0468e146e9e" containerName="collect-profiles" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.280268 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.302723 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lj5nj"] Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450291 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450336 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-registry-certificates\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450358 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk6q7\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-kube-api-access-hk6q7\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450391 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450464 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450494 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-bound-sa-token\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450522 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-trusted-ca\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.450539 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-registry-tls\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.469190 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.551158 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-trusted-ca\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.551202 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-registry-tls\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.551232 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.551255 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-registry-certificates\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.551273 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk6q7\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-kube-api-access-hk6q7\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.551310 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.551332 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-bound-sa-token\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.552344 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.552600 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-registry-certificates\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.552612 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-trusted-ca\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.559412 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-registry-tls\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc 
kubenswrapper[4870]: I0312 00:15:09.560095 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.572625 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk6q7\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-kube-api-access-hk6q7\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.577955 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f861acc-8343-4460-9ec5-c3b7f35ee8cf-bound-sa-token\") pod \"image-registry-66df7c8f76-lj5nj\" (UID: \"8f861acc-8343-4460-9ec5-c3b7f35ee8cf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:09 crc kubenswrapper[4870]: I0312 00:15:09.629692 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:10 crc kubenswrapper[4870]: I0312 00:15:10.045026 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lj5nj"] Mar 12 00:15:10 crc kubenswrapper[4870]: I0312 00:15:10.601304 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" event={"ID":"8f861acc-8343-4460-9ec5-c3b7f35ee8cf","Type":"ContainerStarted","Data":"2da5f3d7a4c96cd9cc23fe942587365e05f7ab3d53ffeaf5be926dc4e0c9d876"} Mar 12 00:15:10 crc kubenswrapper[4870]: I0312 00:15:10.601345 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" event={"ID":"8f861acc-8343-4460-9ec5-c3b7f35ee8cf","Type":"ContainerStarted","Data":"b50c99efad3c49856e421b7d7161b4e05f517e7e1a0ae6311a177cf5d98f6248"} Mar 12 00:15:10 crc kubenswrapper[4870]: I0312 00:15:10.601457 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:10 crc kubenswrapper[4870]: I0312 00:15:10.621079 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" podStartSLOduration=1.6210575409999999 podStartE2EDuration="1.621057541s" podCreationTimestamp="2026-03-12 00:15:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:15:10.619240028 +0000 UTC m=+401.222656368" watchObservedRunningTime="2026-03-12 00:15:10.621057541 +0000 UTC m=+401.224473851" Mar 12 00:15:17 crc kubenswrapper[4870]: I0312 00:15:17.594462 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:15:17 crc kubenswrapper[4870]: I0312 00:15:17.595129 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:15:29 crc kubenswrapper[4870]: I0312 00:15:29.634512 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lj5nj" Mar 12 00:15:29 crc kubenswrapper[4870]: I0312 00:15:29.693501 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c88kv"] Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.317024 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s5gpv"] Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.319496 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.322575 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.328891 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s5gpv"] Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.476553 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca7a7ade-abe4-463f-937d-e6c399cdf72c-catalog-content\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.476627 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb7h2\" (UniqueName: \"kubernetes.io/projected/ca7a7ade-abe4-463f-937d-e6c399cdf72c-kube-api-access-zb7h2\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.476685 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca7a7ade-abe4-463f-937d-e6c399cdf72c-utilities\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.518304 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k6pq4"] Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.519505 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.524197 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.538106 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6pq4"] Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.577443 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca7a7ade-abe4-463f-937d-e6c399cdf72c-utilities\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.577509 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca7a7ade-abe4-463f-937d-e6c399cdf72c-catalog-content\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.577539 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb7h2\" (UniqueName: \"kubernetes.io/projected/ca7a7ade-abe4-463f-937d-e6c399cdf72c-kube-api-access-zb7h2\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.577991 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca7a7ade-abe4-463f-937d-e6c399cdf72c-catalog-content\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " 
pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.578073 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca7a7ade-abe4-463f-937d-e6c399cdf72c-utilities\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.596593 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb7h2\" (UniqueName: \"kubernetes.io/projected/ca7a7ade-abe4-463f-937d-e6c399cdf72c-kube-api-access-zb7h2\") pod \"certified-operators-s5gpv\" (UID: \"ca7a7ade-abe4-463f-937d-e6c399cdf72c\") " pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.678928 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1fb5d4-41a1-4908-a685-974af39fbbc5-utilities\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.678975 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1fb5d4-41a1-4908-a685-974af39fbbc5-catalog-content\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.678999 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkw6m\" (UniqueName: \"kubernetes.io/projected/4f1fb5d4-41a1-4908-a685-974af39fbbc5-kube-api-access-pkw6m\") pod \"community-operators-k6pq4\" (UID: 
\"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.690037 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.779832 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1fb5d4-41a1-4908-a685-974af39fbbc5-utilities\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.779902 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1fb5d4-41a1-4908-a685-974af39fbbc5-catalog-content\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.779936 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkw6m\" (UniqueName: \"kubernetes.io/projected/4f1fb5d4-41a1-4908-a685-974af39fbbc5-kube-api-access-pkw6m\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.780874 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f1fb5d4-41a1-4908-a685-974af39fbbc5-utilities\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.780911 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f1fb5d4-41a1-4908-a685-974af39fbbc5-catalog-content\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.815983 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkw6m\" (UniqueName: \"kubernetes.io/projected/4f1fb5d4-41a1-4908-a685-974af39fbbc5-kube-api-access-pkw6m\") pod \"community-operators-k6pq4\" (UID: \"4f1fb5d4-41a1-4908-a685-974af39fbbc5\") " pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:36 crc kubenswrapper[4870]: I0312 00:15:36.843915 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.122736 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s5gpv"] Mar 12 00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.249798 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6pq4"] Mar 12 00:15:37 crc kubenswrapper[4870]: W0312 00:15:37.261220 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1fb5d4_41a1_4908_a685_974af39fbbc5.slice/crio-30f8626c5dfd9d2fef6d9036e9b80013a3329d449ddba61fb3bd9a370d533df4 WatchSource:0}: Error finding container 30f8626c5dfd9d2fef6d9036e9b80013a3329d449ddba61fb3bd9a370d533df4: Status 404 returned error can't find the container with id 30f8626c5dfd9d2fef6d9036e9b80013a3329d449ddba61fb3bd9a370d533df4 Mar 12 00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.778024 4870 generic.go:334] "Generic (PLEG): container finished" podID="ca7a7ade-abe4-463f-937d-e6c399cdf72c" containerID="95934ed61dd46e9be26399bc14e30f42cc0def8fd0813c85faea613c11f30132" exitCode=0 Mar 12 
00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.778113 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5gpv" event={"ID":"ca7a7ade-abe4-463f-937d-e6c399cdf72c","Type":"ContainerDied","Data":"95934ed61dd46e9be26399bc14e30f42cc0def8fd0813c85faea613c11f30132"} Mar 12 00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.780875 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5gpv" event={"ID":"ca7a7ade-abe4-463f-937d-e6c399cdf72c","Type":"ContainerStarted","Data":"ca492cb6900e8a2ea90e740f74aa00b67fa77d35da201dbe29c1b63d020644fc"} Mar 12 00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.784516 4870 generic.go:334] "Generic (PLEG): container finished" podID="4f1fb5d4-41a1-4908-a685-974af39fbbc5" containerID="063d3ab029c035fa7939220400c848f1c75c49a56479b417ae487f73d88d7e32" exitCode=0 Mar 12 00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.784541 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6pq4" event={"ID":"4f1fb5d4-41a1-4908-a685-974af39fbbc5","Type":"ContainerDied","Data":"063d3ab029c035fa7939220400c848f1c75c49a56479b417ae487f73d88d7e32"} Mar 12 00:15:37 crc kubenswrapper[4870]: I0312 00:15:37.784558 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6pq4" event={"ID":"4f1fb5d4-41a1-4908-a685-974af39fbbc5","Type":"ContainerStarted","Data":"30f8626c5dfd9d2fef6d9036e9b80013a3329d449ddba61fb3bd9a370d533df4"} Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.717498 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m2z2k"] Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.719633 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.722337 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.730132 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2z2k"] Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.791156 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5gpv" event={"ID":"ca7a7ade-abe4-463f-937d-e6c399cdf72c","Type":"ContainerStarted","Data":"4a911ab4c948b813ad4b639f0a714c5edbbbe3d9068f43430f9aa0248fbf9fc5"} Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.793686 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6pq4" event={"ID":"4f1fb5d4-41a1-4908-a685-974af39fbbc5","Type":"ContainerStarted","Data":"f642ecc23ea24920e63546c2ca7da824744a1d22bb3430dee30a9f6e644c18e2"} Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.809732 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-utilities\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.809800 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-catalog-content\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.810012 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgf4z\" (UniqueName: \"kubernetes.io/projected/bafff646-ca93-422c-8f5d-e0f30e852b71-kube-api-access-rgf4z\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.911702 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgf4z\" (UniqueName: \"kubernetes.io/projected/bafff646-ca93-422c-8f5d-e0f30e852b71-kube-api-access-rgf4z\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.911830 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-utilities\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.911873 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-catalog-content\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.912654 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-catalog-content\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.913589 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-utilities\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.915897 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d2sbw"] Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.917513 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.922552 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.940236 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2sbw"] Mar 12 00:15:38 crc kubenswrapper[4870]: I0312 00:15:38.953489 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgf4z\" (UniqueName: \"kubernetes.io/projected/bafff646-ca93-422c-8f5d-e0f30e852b71-kube-api-access-rgf4z\") pod \"redhat-marketplace-m2z2k\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") " pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.013587 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79r9\" (UniqueName: \"kubernetes.io/projected/6493de17-4588-4ee6-8d01-ad464fbc01a4-kube-api-access-s79r9\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.013658 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/6493de17-4588-4ee6-8d01-ad464fbc01a4-utilities\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.013717 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6493de17-4588-4ee6-8d01-ad464fbc01a4-catalog-content\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.090471 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.114501 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6493de17-4588-4ee6-8d01-ad464fbc01a4-catalog-content\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.114578 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s79r9\" (UniqueName: \"kubernetes.io/projected/6493de17-4588-4ee6-8d01-ad464fbc01a4-kube-api-access-s79r9\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.114619 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6493de17-4588-4ee6-8d01-ad464fbc01a4-utilities\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 
12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.115080 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6493de17-4588-4ee6-8d01-ad464fbc01a4-utilities\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.115172 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6493de17-4588-4ee6-8d01-ad464fbc01a4-catalog-content\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.146850 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79r9\" (UniqueName: \"kubernetes.io/projected/6493de17-4588-4ee6-8d01-ad464fbc01a4-kube-api-access-s79r9\") pod \"redhat-operators-d2sbw\" (UID: \"6493de17-4588-4ee6-8d01-ad464fbc01a4\") " pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.234481 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.355255 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2z2k"] Mar 12 00:15:39 crc kubenswrapper[4870]: W0312 00:15:39.369184 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbafff646_ca93_422c_8f5d_e0f30e852b71.slice/crio-3cf6d4864b6a635ab951e83ded94a818f4c111cb5fcba861f9eff49967b88cbe WatchSource:0}: Error finding container 3cf6d4864b6a635ab951e83ded94a818f4c111cb5fcba861f9eff49967b88cbe: Status 404 returned error can't find the container with id 3cf6d4864b6a635ab951e83ded94a818f4c111cb5fcba861f9eff49967b88cbe Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.496269 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2sbw"] Mar 12 00:15:39 crc kubenswrapper[4870]: W0312 00:15:39.498651 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6493de17_4588_4ee6_8d01_ad464fbc01a4.slice/crio-2cef0be6239b7e954ab8ae33101368713d650077971eedbbe5d8ac5565f795f3 WatchSource:0}: Error finding container 2cef0be6239b7e954ab8ae33101368713d650077971eedbbe5d8ac5565f795f3: Status 404 returned error can't find the container with id 2cef0be6239b7e954ab8ae33101368713d650077971eedbbe5d8ac5565f795f3 Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.801886 4870 generic.go:334] "Generic (PLEG): container finished" podID="6493de17-4588-4ee6-8d01-ad464fbc01a4" containerID="094584c822d8183e7aa13e95eec64d6ab60e83e4e60195a6a73f29c5c91d014a" exitCode=0 Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.801955 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2sbw" 
event={"ID":"6493de17-4588-4ee6-8d01-ad464fbc01a4","Type":"ContainerDied","Data":"094584c822d8183e7aa13e95eec64d6ab60e83e4e60195a6a73f29c5c91d014a"} Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.802024 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2sbw" event={"ID":"6493de17-4588-4ee6-8d01-ad464fbc01a4","Type":"ContainerStarted","Data":"2cef0be6239b7e954ab8ae33101368713d650077971eedbbe5d8ac5565f795f3"} Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.808065 4870 generic.go:334] "Generic (PLEG): container finished" podID="4f1fb5d4-41a1-4908-a685-974af39fbbc5" containerID="f642ecc23ea24920e63546c2ca7da824744a1d22bb3430dee30a9f6e644c18e2" exitCode=0 Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.808127 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6pq4" event={"ID":"4f1fb5d4-41a1-4908-a685-974af39fbbc5","Type":"ContainerDied","Data":"f642ecc23ea24920e63546c2ca7da824744a1d22bb3430dee30a9f6e644c18e2"} Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.816987 4870 generic.go:334] "Generic (PLEG): container finished" podID="ca7a7ade-abe4-463f-937d-e6c399cdf72c" containerID="4a911ab4c948b813ad4b639f0a714c5edbbbe3d9068f43430f9aa0248fbf9fc5" exitCode=0 Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.817086 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5gpv" event={"ID":"ca7a7ade-abe4-463f-937d-e6c399cdf72c","Type":"ContainerDied","Data":"4a911ab4c948b813ad4b639f0a714c5edbbbe3d9068f43430f9aa0248fbf9fc5"} Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.820819 4870 generic.go:334] "Generic (PLEG): container finished" podID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerID="023df00116809cf17a0e9a819b147a660cb3e668f5788ec0e8c987ce1a7a2535" exitCode=0 Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.820874 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-m2z2k" event={"ID":"bafff646-ca93-422c-8f5d-e0f30e852b71","Type":"ContainerDied","Data":"023df00116809cf17a0e9a819b147a660cb3e668f5788ec0e8c987ce1a7a2535"} Mar 12 00:15:39 crc kubenswrapper[4870]: I0312 00:15:39.820912 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2z2k" event={"ID":"bafff646-ca93-422c-8f5d-e0f30e852b71","Type":"ContainerStarted","Data":"3cf6d4864b6a635ab951e83ded94a818f4c111cb5fcba861f9eff49967b88cbe"} Mar 12 00:15:40 crc kubenswrapper[4870]: I0312 00:15:40.827963 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5gpv" event={"ID":"ca7a7ade-abe4-463f-937d-e6c399cdf72c","Type":"ContainerStarted","Data":"49fa407404bd30fcd593ada0002f1cb5ba32f2c2e6b06856628d43c9091b78e1"} Mar 12 00:15:40 crc kubenswrapper[4870]: I0312 00:15:40.829707 4870 generic.go:334] "Generic (PLEG): container finished" podID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerID="ef4d7369d6e64b07e704a9642a17d77787421d298dc420a5053f8298bf5f9e96" exitCode=0 Mar 12 00:15:40 crc kubenswrapper[4870]: I0312 00:15:40.829777 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2z2k" event={"ID":"bafff646-ca93-422c-8f5d-e0f30e852b71","Type":"ContainerDied","Data":"ef4d7369d6e64b07e704a9642a17d77787421d298dc420a5053f8298bf5f9e96"} Mar 12 00:15:40 crc kubenswrapper[4870]: I0312 00:15:40.831935 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2sbw" event={"ID":"6493de17-4588-4ee6-8d01-ad464fbc01a4","Type":"ContainerStarted","Data":"3272c129c92e35680205fc22e543fe658b738f8795ebec4e310d426cf175fb8a"} Mar 12 00:15:40 crc kubenswrapper[4870]: I0312 00:15:40.833794 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6pq4" 
event={"ID":"4f1fb5d4-41a1-4908-a685-974af39fbbc5","Type":"ContainerStarted","Data":"e58346121fa1f5b159f779e50d0c9cd4bfda571723fcb7b5181096180d7d95fc"} Mar 12 00:15:40 crc kubenswrapper[4870]: I0312 00:15:40.855135 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s5gpv" podStartSLOduration=2.352307519 podStartE2EDuration="4.855117575s" podCreationTimestamp="2026-03-12 00:15:36 +0000 UTC" firstStartedPulling="2026-03-12 00:15:37.779521633 +0000 UTC m=+428.382937943" lastFinishedPulling="2026-03-12 00:15:40.282331669 +0000 UTC m=+430.885747999" observedRunningTime="2026-03-12 00:15:40.853859528 +0000 UTC m=+431.457275878" watchObservedRunningTime="2026-03-12 00:15:40.855117575 +0000 UTC m=+431.458533885" Mar 12 00:15:40 crc kubenswrapper[4870]: I0312 00:15:40.895939 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6pq4" podStartSLOduration=2.479556278 podStartE2EDuration="4.89590592s" podCreationTimestamp="2026-03-12 00:15:36 +0000 UTC" firstStartedPulling="2026-03-12 00:15:37.786129487 +0000 UTC m=+428.389545797" lastFinishedPulling="2026-03-12 00:15:40.202479089 +0000 UTC m=+430.805895439" observedRunningTime="2026-03-12 00:15:40.888020209 +0000 UTC m=+431.491436519" watchObservedRunningTime="2026-03-12 00:15:40.89590592 +0000 UTC m=+431.499322270" Mar 12 00:15:41 crc kubenswrapper[4870]: I0312 00:15:41.840625 4870 generic.go:334] "Generic (PLEG): container finished" podID="6493de17-4588-4ee6-8d01-ad464fbc01a4" containerID="3272c129c92e35680205fc22e543fe658b738f8795ebec4e310d426cf175fb8a" exitCode=0 Mar 12 00:15:41 crc kubenswrapper[4870]: I0312 00:15:41.840727 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2sbw" event={"ID":"6493de17-4588-4ee6-8d01-ad464fbc01a4","Type":"ContainerDied","Data":"3272c129c92e35680205fc22e543fe658b738f8795ebec4e310d426cf175fb8a"} Mar 12 
00:15:41 crc kubenswrapper[4870]: I0312 00:15:41.842471 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2z2k" event={"ID":"bafff646-ca93-422c-8f5d-e0f30e852b71","Type":"ContainerStarted","Data":"6c4b0a5328ece4a44202d6b92ad1e0aa504d8dceaae5d7b9d22d21ee84ab386c"} Mar 12 00:15:41 crc kubenswrapper[4870]: I0312 00:15:41.879037 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m2z2k" podStartSLOduration=2.47346451 podStartE2EDuration="3.879023051s" podCreationTimestamp="2026-03-12 00:15:38 +0000 UTC" firstStartedPulling="2026-03-12 00:15:39.822930216 +0000 UTC m=+430.426346566" lastFinishedPulling="2026-03-12 00:15:41.228488797 +0000 UTC m=+431.831905107" observedRunningTime="2026-03-12 00:15:41.877915229 +0000 UTC m=+432.481331539" watchObservedRunningTime="2026-03-12 00:15:41.879023051 +0000 UTC m=+432.482439361" Mar 12 00:15:42 crc kubenswrapper[4870]: I0312 00:15:42.850182 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2sbw" event={"ID":"6493de17-4588-4ee6-8d01-ad464fbc01a4","Type":"ContainerStarted","Data":"779316359f4527d7adcfdcdbdb975524252a9422c44618a810ac97c4d553b41d"} Mar 12 00:15:42 crc kubenswrapper[4870]: I0312 00:15:42.871492 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d2sbw" podStartSLOduration=2.406648183 podStartE2EDuration="4.871473656s" podCreationTimestamp="2026-03-12 00:15:38 +0000 UTC" firstStartedPulling="2026-03-12 00:15:39.806036771 +0000 UTC m=+430.409453111" lastFinishedPulling="2026-03-12 00:15:42.270862264 +0000 UTC m=+432.874278584" observedRunningTime="2026-03-12 00:15:42.871086675 +0000 UTC m=+433.474502985" watchObservedRunningTime="2026-03-12 00:15:42.871473656 +0000 UTC m=+433.474889966" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.690530 4870 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.691057 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.757889 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.845504 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.845567 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.884658 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.916079 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s5gpv" Mar 12 00:15:46 crc kubenswrapper[4870]: I0312 00:15:46.935612 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6pq4" Mar 12 00:15:47 crc kubenswrapper[4870]: I0312 00:15:47.595000 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:15:47 crc kubenswrapper[4870]: I0312 00:15:47.595087 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" 
podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:15:47 crc kubenswrapper[4870]: I0312 00:15:47.595176 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:15:47 crc kubenswrapper[4870]: I0312 00:15:47.596013 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8c3058facf9a2b52988623f4f9078fda5941f091e6fa03732a464860e4b1dac"} pod="openshift-machine-config-operator/machine-config-daemon-84dfr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 00:15:47 crc kubenswrapper[4870]: I0312 00:15:47.596098 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" containerID="cri-o://d8c3058facf9a2b52988623f4f9078fda5941f091e6fa03732a464860e4b1dac" gracePeriod=600 Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.093513 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.095650 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.137940 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.236032 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 
00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.237122 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.903890 4870 generic.go:334] "Generic (PLEG): container finished" podID="988c0290-1e98-46c8-8253-a4718914b9ef" containerID="d8c3058facf9a2b52988623f4f9078fda5941f091e6fa03732a464860e4b1dac" exitCode=0 Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.904275 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerDied","Data":"d8c3058facf9a2b52988623f4f9078fda5941f091e6fa03732a464860e4b1dac"} Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.904352 4870 scope.go:117] "RemoveContainer" containerID="9de1b2b29f0ee2c9171c7b19fcd944b46b255d00ba17a4db07e37085fd3fc4ff" Mar 12 00:15:49 crc kubenswrapper[4870]: I0312 00:15:49.951968 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m2z2k" Mar 12 00:15:50 crc kubenswrapper[4870]: I0312 00:15:50.283376 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d2sbw" podUID="6493de17-4588-4ee6-8d01-ad464fbc01a4" containerName="registry-server" probeResult="failure" output=< Mar 12 00:15:50 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Mar 12 00:15:50 crc kubenswrapper[4870]: > Mar 12 00:15:50 crc kubenswrapper[4870]: I0312 00:15:50.912599 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"1741f7c30d6275bdbc591187e9d7f1701084fc4106be15d405493726cd83c068"} Mar 12 00:15:54 crc kubenswrapper[4870]: I0312 00:15:54.745242 4870 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" podUID="18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" containerName="registry" containerID="cri-o://7fe106341923b90330a49acbf89a3052f930b3f0a5de8e6563bfe0507afd2851" gracePeriod=30 Mar 12 00:15:54 crc kubenswrapper[4870]: I0312 00:15:54.943186 4870 generic.go:334] "Generic (PLEG): container finished" podID="18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" containerID="7fe106341923b90330a49acbf89a3052f930b3f0a5de8e6563bfe0507afd2851" exitCode=0 Mar 12 00:15:54 crc kubenswrapper[4870]: I0312 00:15:54.943306 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" event={"ID":"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a","Type":"ContainerDied","Data":"7fe106341923b90330a49acbf89a3052f930b3f0a5de8e6563bfe0507afd2851"} Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.239095 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.385697 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-bound-sa-token\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.385796 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-tls\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.385901 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-certificates\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.386172 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.386235 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prqs5\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-kube-api-access-prqs5\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.386311 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-trusted-ca\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.386359 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-installation-pull-secrets\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.386442 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-ca-trust-extracted\") pod \"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\" (UID: 
\"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a\") " Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.388199 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.388334 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.399076 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-kube-api-access-prqs5" (OuterVolumeSpecName: "kube-api-access-prqs5") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "kube-api-access-prqs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.399813 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.400129 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.400695 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.407220 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.430119 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" (UID: "18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.487775 4870 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.487827 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.487849 4870 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.487867 4870 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.487890 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prqs5\" (UniqueName: \"kubernetes.io/projected/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-kube-api-access-prqs5\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.487910 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.487928 4870 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 00:15:55 crc 
kubenswrapper[4870]: I0312 00:15:55.953642 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" event={"ID":"18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a","Type":"ContainerDied","Data":"2c9771370db82e4981e59ecc649924dbd0fa35eaf63836bf90b8109030238397"} Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.953687 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c88kv" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.953698 4870 scope.go:117] "RemoveContainer" containerID="7fe106341923b90330a49acbf89a3052f930b3f0a5de8e6563bfe0507afd2851" Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.986807 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c88kv"] Mar 12 00:15:55 crc kubenswrapper[4870]: I0312 00:15:55.989383 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c88kv"] Mar 12 00:15:56 crc kubenswrapper[4870]: I0312 00:15:56.111766 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" path="/var/lib/kubelet/pods/18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a/volumes" Mar 12 00:15:59 crc kubenswrapper[4870]: I0312 00:15:59.275930 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:15:59 crc kubenswrapper[4870]: I0312 00:15:59.317789 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d2sbw" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.140646 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554576-mrf2j"] Mar 12 00:16:00 crc kubenswrapper[4870]: E0312 00:16:00.141158 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" containerName="registry" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.141172 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" containerName="registry" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.141300 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b4fa2c-97f8-4de2-8d2c-d3fee5338e1a" containerName="registry" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.141640 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554576-mrf2j" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.145093 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.146453 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.148824 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554576-mrf2j"] Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.149603 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.253358 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktkm8\" (UniqueName: \"kubernetes.io/projected/aa0258ba-0167-4413-8de7-5b01a8faec96-kube-api-access-ktkm8\") pod \"auto-csr-approver-29554576-mrf2j\" (UID: \"aa0258ba-0167-4413-8de7-5b01a8faec96\") " pod="openshift-infra/auto-csr-approver-29554576-mrf2j" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.355010 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktkm8\" (UniqueName: 
\"kubernetes.io/projected/aa0258ba-0167-4413-8de7-5b01a8faec96-kube-api-access-ktkm8\") pod \"auto-csr-approver-29554576-mrf2j\" (UID: \"aa0258ba-0167-4413-8de7-5b01a8faec96\") " pod="openshift-infra/auto-csr-approver-29554576-mrf2j" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.390492 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktkm8\" (UniqueName: \"kubernetes.io/projected/aa0258ba-0167-4413-8de7-5b01a8faec96-kube-api-access-ktkm8\") pod \"auto-csr-approver-29554576-mrf2j\" (UID: \"aa0258ba-0167-4413-8de7-5b01a8faec96\") " pod="openshift-infra/auto-csr-approver-29554576-mrf2j" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.458549 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554576-mrf2j" Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.861015 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554576-mrf2j"] Mar 12 00:16:00 crc kubenswrapper[4870]: I0312 00:16:00.988206 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554576-mrf2j" event={"ID":"aa0258ba-0167-4413-8de7-5b01a8faec96","Type":"ContainerStarted","Data":"8aacfc757648d58b85e4f52fb495ed5525d2c30bf9261df19a7cb955643cfea2"} Mar 12 00:16:03 crc kubenswrapper[4870]: I0312 00:16:03.000647 4870 generic.go:334] "Generic (PLEG): container finished" podID="aa0258ba-0167-4413-8de7-5b01a8faec96" containerID="527647a7d31cbda72fa8e802ea76c84fdf0213032e30a9a5f1c782c1800b1c27" exitCode=0 Mar 12 00:16:03 crc kubenswrapper[4870]: I0312 00:16:03.000700 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554576-mrf2j" event={"ID":"aa0258ba-0167-4413-8de7-5b01a8faec96","Type":"ContainerDied","Data":"527647a7d31cbda72fa8e802ea76c84fdf0213032e30a9a5f1c782c1800b1c27"} Mar 12 00:16:04 crc kubenswrapper[4870]: I0312 00:16:04.370295 4870 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554576-mrf2j" Mar 12 00:16:04 crc kubenswrapper[4870]: I0312 00:16:04.505342 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktkm8\" (UniqueName: \"kubernetes.io/projected/aa0258ba-0167-4413-8de7-5b01a8faec96-kube-api-access-ktkm8\") pod \"aa0258ba-0167-4413-8de7-5b01a8faec96\" (UID: \"aa0258ba-0167-4413-8de7-5b01a8faec96\") " Mar 12 00:16:04 crc kubenswrapper[4870]: I0312 00:16:04.512477 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0258ba-0167-4413-8de7-5b01a8faec96-kube-api-access-ktkm8" (OuterVolumeSpecName: "kube-api-access-ktkm8") pod "aa0258ba-0167-4413-8de7-5b01a8faec96" (UID: "aa0258ba-0167-4413-8de7-5b01a8faec96"). InnerVolumeSpecName "kube-api-access-ktkm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:16:04 crc kubenswrapper[4870]: I0312 00:16:04.606421 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktkm8\" (UniqueName: \"kubernetes.io/projected/aa0258ba-0167-4413-8de7-5b01a8faec96-kube-api-access-ktkm8\") on node \"crc\" DevicePath \"\"" Mar 12 00:16:05 crc kubenswrapper[4870]: I0312 00:16:05.019293 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554576-mrf2j" event={"ID":"aa0258ba-0167-4413-8de7-5b01a8faec96","Type":"ContainerDied","Data":"8aacfc757648d58b85e4f52fb495ed5525d2c30bf9261df19a7cb955643cfea2"} Mar 12 00:16:05 crc kubenswrapper[4870]: I0312 00:16:05.019343 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aacfc757648d58b85e4f52fb495ed5525d2c30bf9261df19a7cb955643cfea2" Mar 12 00:16:05 crc kubenswrapper[4870]: I0312 00:16:05.019393 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554576-mrf2j" Mar 12 00:16:05 crc kubenswrapper[4870]: I0312 00:16:05.454576 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554570-l4btp"] Mar 12 00:16:05 crc kubenswrapper[4870]: I0312 00:16:05.460404 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554570-l4btp"] Mar 12 00:16:06 crc kubenswrapper[4870]: I0312 00:16:06.111899 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf754ba1-52f1-478d-9b07-1d83e55d3020" path="/var/lib/kubelet/pods/cf754ba1-52f1-478d-9b07-1d83e55d3020/volumes" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.155658 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554578-zb5pv"] Mar 12 00:18:00 crc kubenswrapper[4870]: E0312 00:18:00.156715 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0258ba-0167-4413-8de7-5b01a8faec96" containerName="oc" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.156742 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0258ba-0167-4413-8de7-5b01a8faec96" containerName="oc" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.156985 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0258ba-0167-4413-8de7-5b01a8faec96" containerName="oc" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.157724 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554578-zb5pv" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.161362 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.161709 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.163236 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.177445 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554578-zb5pv"] Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.321566 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqkjn\" (UniqueName: \"kubernetes.io/projected/00368cf7-b70c-425e-843a-f57d1ed13c51-kube-api-access-wqkjn\") pod \"auto-csr-approver-29554578-zb5pv\" (UID: \"00368cf7-b70c-425e-843a-f57d1ed13c51\") " pod="openshift-infra/auto-csr-approver-29554578-zb5pv" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.422997 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqkjn\" (UniqueName: \"kubernetes.io/projected/00368cf7-b70c-425e-843a-f57d1ed13c51-kube-api-access-wqkjn\") pod \"auto-csr-approver-29554578-zb5pv\" (UID: \"00368cf7-b70c-425e-843a-f57d1ed13c51\") " pod="openshift-infra/auto-csr-approver-29554578-zb5pv" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.457623 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqkjn\" (UniqueName: \"kubernetes.io/projected/00368cf7-b70c-425e-843a-f57d1ed13c51-kube-api-access-wqkjn\") pod \"auto-csr-approver-29554578-zb5pv\" (UID: \"00368cf7-b70c-425e-843a-f57d1ed13c51\") " 
pod="openshift-infra/auto-csr-approver-29554578-zb5pv" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.487830 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554578-zb5pv" Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.767263 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554578-zb5pv"] Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.777879 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 00:18:00 crc kubenswrapper[4870]: I0312 00:18:00.852984 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554578-zb5pv" event={"ID":"00368cf7-b70c-425e-843a-f57d1ed13c51","Type":"ContainerStarted","Data":"5bf36263ce69bdce982c849438664c563bce23402454dd3bd23a2a80b3d0724d"} Mar 12 00:18:02 crc kubenswrapper[4870]: I0312 00:18:02.868926 4870 generic.go:334] "Generic (PLEG): container finished" podID="00368cf7-b70c-425e-843a-f57d1ed13c51" containerID="d89b172eb80071e840719224a556eb15433e50234f22f512951e9715781646ca" exitCode=0 Mar 12 00:18:02 crc kubenswrapper[4870]: I0312 00:18:02.869027 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554578-zb5pv" event={"ID":"00368cf7-b70c-425e-843a-f57d1ed13c51","Type":"ContainerDied","Data":"d89b172eb80071e840719224a556eb15433e50234f22f512951e9715781646ca"} Mar 12 00:18:04 crc kubenswrapper[4870]: I0312 00:18:04.121202 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554578-zb5pv" Mar 12 00:18:04 crc kubenswrapper[4870]: I0312 00:18:04.303808 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqkjn\" (UniqueName: \"kubernetes.io/projected/00368cf7-b70c-425e-843a-f57d1ed13c51-kube-api-access-wqkjn\") pod \"00368cf7-b70c-425e-843a-f57d1ed13c51\" (UID: \"00368cf7-b70c-425e-843a-f57d1ed13c51\") " Mar 12 00:18:04 crc kubenswrapper[4870]: I0312 00:18:04.312989 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00368cf7-b70c-425e-843a-f57d1ed13c51-kube-api-access-wqkjn" (OuterVolumeSpecName: "kube-api-access-wqkjn") pod "00368cf7-b70c-425e-843a-f57d1ed13c51" (UID: "00368cf7-b70c-425e-843a-f57d1ed13c51"). InnerVolumeSpecName "kube-api-access-wqkjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:18:04 crc kubenswrapper[4870]: I0312 00:18:04.406036 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqkjn\" (UniqueName: \"kubernetes.io/projected/00368cf7-b70c-425e-843a-f57d1ed13c51-kube-api-access-wqkjn\") on node \"crc\" DevicePath \"\"" Mar 12 00:18:04 crc kubenswrapper[4870]: I0312 00:18:04.888974 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554578-zb5pv" event={"ID":"00368cf7-b70c-425e-843a-f57d1ed13c51","Type":"ContainerDied","Data":"5bf36263ce69bdce982c849438664c563bce23402454dd3bd23a2a80b3d0724d"} Mar 12 00:18:04 crc kubenswrapper[4870]: I0312 00:18:04.889026 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf36263ce69bdce982c849438664c563bce23402454dd3bd23a2a80b3d0724d" Mar 12 00:18:04 crc kubenswrapper[4870]: I0312 00:18:04.889079 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554578-zb5pv" Mar 12 00:18:05 crc kubenswrapper[4870]: I0312 00:18:05.198109 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554572-7fms9"] Mar 12 00:18:05 crc kubenswrapper[4870]: I0312 00:18:05.204578 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554572-7fms9"] Mar 12 00:18:06 crc kubenswrapper[4870]: I0312 00:18:06.116985 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec36d635-25f6-4396-9218-6b5aa2c6809b" path="/var/lib/kubelet/pods/ec36d635-25f6-4396-9218-6b5aa2c6809b/volumes" Mar 12 00:18:17 crc kubenswrapper[4870]: I0312 00:18:17.595318 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:18:17 crc kubenswrapper[4870]: I0312 00:18:17.595969 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:18:30 crc kubenswrapper[4870]: I0312 00:18:30.510271 4870 scope.go:117] "RemoveContainer" containerID="b14148f3b729554d1abb5d773802a4d249b27bd29d37af8bdaeb5b80c269258f" Mar 12 00:18:30 crc kubenswrapper[4870]: I0312 00:18:30.548829 4870 scope.go:117] "RemoveContainer" containerID="f5dfcbb3bab0bb4c82fb75499c7cf6d7e84426362dd5384009293fb99e4f45f6" Mar 12 00:18:47 crc kubenswrapper[4870]: I0312 00:18:47.594752 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:18:47 crc kubenswrapper[4870]: I0312 00:18:47.595500 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:19:17 crc kubenswrapper[4870]: I0312 00:19:17.594317 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:19:17 crc kubenswrapper[4870]: I0312 00:19:17.595283 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:19:17 crc kubenswrapper[4870]: I0312 00:19:17.595356 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:19:17 crc kubenswrapper[4870]: I0312 00:19:17.596976 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1741f7c30d6275bdbc591187e9d7f1701084fc4106be15d405493726cd83c068"} pod="openshift-machine-config-operator/machine-config-daemon-84dfr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 00:19:17 crc kubenswrapper[4870]: I0312 00:19:17.597098 4870 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" containerID="cri-o://1741f7c30d6275bdbc591187e9d7f1701084fc4106be15d405493726cd83c068" gracePeriod=600 Mar 12 00:19:18 crc kubenswrapper[4870]: I0312 00:19:18.411681 4870 generic.go:334] "Generic (PLEG): container finished" podID="988c0290-1e98-46c8-8253-a4718914b9ef" containerID="1741f7c30d6275bdbc591187e9d7f1701084fc4106be15d405493726cd83c068" exitCode=0 Mar 12 00:19:18 crc kubenswrapper[4870]: I0312 00:19:18.411774 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerDied","Data":"1741f7c30d6275bdbc591187e9d7f1701084fc4106be15d405493726cd83c068"} Mar 12 00:19:18 crc kubenswrapper[4870]: I0312 00:19:18.412380 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"9d34e3dbb71186ce8356c02e5bee2ab1ff708583b71cba126470e3c14ba16321"} Mar 12 00:19:18 crc kubenswrapper[4870]: I0312 00:19:18.412414 4870 scope.go:117] "RemoveContainer" containerID="d8c3058facf9a2b52988623f4f9078fda5941f091e6fa03732a464860e4b1dac" Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.898496 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xwrqb"] Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.899941 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-controller" containerID="cri-o://7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a" gracePeriod=30 Mar 12 00:19:26 crc 
kubenswrapper[4870]: I0312 00:19:26.900063 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9" gracePeriod=30 Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.900133 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="northd" containerID="cri-o://e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf" gracePeriod=30 Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.900063 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-node" containerID="cri-o://a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1" gracePeriod=30 Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.900223 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-acl-logging" containerID="cri-o://5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7" gracePeriod=30 Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.900311 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="sbdb" containerID="cri-o://2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a" gracePeriod=30 Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.900536 4870 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="nbdb" containerID="cri-o://4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b" gracePeriod=30 Mar 12 00:19:26 crc kubenswrapper[4870]: I0312 00:19:26.948930 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" containerID="cri-o://71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" gracePeriod=30 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.255786 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/3.log" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.259837 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovn-acl-logging/0.log" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.260900 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovn-controller/0.log" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.261806 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.278906 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-slash\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279011 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-script-lib\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279061 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-node-log\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279084 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-slash" (OuterVolumeSpecName: "host-slash") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279105 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-netd\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279272 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-openvswitch\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279320 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-etc-openvswitch\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279353 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-log-socket\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279382 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-bin\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279416 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-systemd\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279502 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovn-node-metrics-cert\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279554 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-env-overrides\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279615 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-config\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279660 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-ovn\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279702 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr49h\" (UniqueName: \"kubernetes.io/projected/467385e2-3bbf-4cf0-909a-8e878b5d86dc-kube-api-access-hr49h\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 
00:19:27.279754 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279806 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-netns\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279840 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-systemd-units\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279879 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-ovn-kubernetes\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279930 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-var-lib-openvswitch\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279974 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-kubelet\") pod \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\" (UID: \"467385e2-3bbf-4cf0-909a-8e878b5d86dc\") " Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.280487 4870 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-slash\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.279268 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.280557 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.280588 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.280657 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.280692 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-log-socket" (OuterVolumeSpecName: "log-socket") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.280727 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.281397 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.281500 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-node-log" (OuterVolumeSpecName: "node-log") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.281552 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.282014 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.282081 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.282105 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.282460 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.282767 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.282923 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.282954 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.298529 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467385e2-3bbf-4cf0-909a-8e878b5d86dc-kube-api-access-hr49h" (OuterVolumeSpecName: "kube-api-access-hr49h") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "kube-api-access-hr49h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.299509 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.311318 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "467385e2-3bbf-4cf0-909a-8e878b5d86dc" (UID: "467385e2-3bbf-4cf0-909a-8e878b5d86dc"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328347 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t2jhp"] Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328611 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328632 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328641 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="northd" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328653 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="northd" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328666 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="nbdb" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328679 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="nbdb" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328692 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328702 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328720 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 
00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328729 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328744 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-node" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328882 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-node" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328894 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-acl-logging" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328903 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-acl-logging" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328914 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="sbdb" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328923 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="sbdb" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328936 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00368cf7-b70c-425e-843a-f57d1ed13c51" containerName="oc" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328945 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="00368cf7-b70c-425e-843a-f57d1ed13c51" containerName="oc" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328955 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 
00:19:27.328963 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328974 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.328983 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.328998 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329006 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.329016 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kubecfg-setup" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329023 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kubecfg-setup" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329210 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-acl-logging" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329233 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="sbdb" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329242 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: 
I0312 00:19:27.329252 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="00368cf7-b70c-425e-843a-f57d1ed13c51" containerName="oc" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329263 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="northd" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329273 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329284 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="kube-rbac-proxy-node" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329292 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="nbdb" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329301 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329310 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329321 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329330 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovn-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: E0312 00:19:27.329450 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 
00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329459 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.329568 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerName="ovnkube-controller" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.331784 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382206 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-cni-bin\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382249 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-systemd-units\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382277 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-etc-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382296 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382311 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-run-ovn-kubernetes\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382329 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovnkube-script-lib\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382348 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-ovn\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382363 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382385 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-systemd\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382403 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovnkube-config\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382419 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-run-netns\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382437 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-kubelet\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382453 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-var-lib-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382470 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-cni-netd\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382489 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-env-overrides\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382503 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-node-log\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382525 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-log-socket\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382541 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovn-node-metrics-cert\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382558 
4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fng5p\" (UniqueName: \"kubernetes.io/projected/672aee0b-4d9e-4466-9906-4b4f00ff7f11-kube-api-access-fng5p\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382588 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-slash\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382624 4870 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382637 4870 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382647 4870 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382655 4870 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382666 4870 reconciler_common.go:293] "Volume detached 
for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382675 4870 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382685 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382697 4870 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-node-log\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382707 4870 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382716 4870 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382724 4870 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382733 4870 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-log-socket\") on 
node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382741 4870 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382750 4870 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382759 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382770 4870 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382778 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467385e2-3bbf-4cf0-909a-8e878b5d86dc-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382788 4870 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467385e2-3bbf-4cf0-909a-8e878b5d86dc-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.382797 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr49h\" (UniqueName: \"kubernetes.io/projected/467385e2-3bbf-4cf0-909a-8e878b5d86dc-kube-api-access-hr49h\") on node \"crc\" DevicePath \"\"" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.483826 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-cni-bin\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484019 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-systemd-units\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484070 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-etc-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484183 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484242 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-systemd-units\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484299 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484258 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-etc-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484297 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-run-ovn-kubernetes\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.483929 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-cni-bin\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484438 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-run-ovn-kubernetes\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484442 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovnkube-script-lib\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484630 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-ovn\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484669 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484724 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-systemd\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484745 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-ovn\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484774 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovnkube-config\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484799 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-run-netns\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484827 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-kubelet\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484828 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484848 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-var-lib-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484875 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-cni-netd\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484906 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-env-overrides\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484930 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-node-log\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.484977 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-log-socket\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485003 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovn-node-metrics-cert\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485025 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fng5p\" (UniqueName: 
\"kubernetes.io/projected/672aee0b-4d9e-4466-9906-4b4f00ff7f11-kube-api-access-fng5p\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485118 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-slash\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485195 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-run-systemd\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485199 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-cni-netd\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485283 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-kubelet\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485244 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-run-netns\") pod \"ovnkube-node-t2jhp\" (UID: 
\"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485287 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-var-lib-openvswitch\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485271 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-host-slash\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485206 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-log-socket\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485324 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/672aee0b-4d9e-4466-9906-4b4f00ff7f11-node-log\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485762 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovnkube-script-lib\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc 
kubenswrapper[4870]: I0312 00:19:27.485762 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovnkube-config\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.485915 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/672aee0b-4d9e-4466-9906-4b4f00ff7f11-env-overrides\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.492112 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/672aee0b-4d9e-4466-9906-4b4f00ff7f11-ovn-node-metrics-cert\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.509848 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fng5p\" (UniqueName: \"kubernetes.io/projected/672aee0b-4d9e-4466-9906-4b4f00ff7f11-kube-api-access-fng5p\") pod \"ovnkube-node-t2jhp\" (UID: \"672aee0b-4d9e-4466-9906-4b4f00ff7f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.649517 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.987088 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovnkube-controller/3.log" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.990396 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovn-acl-logging/0.log" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.990999 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xwrqb_467385e2-3bbf-4cf0-909a-8e878b5d86dc/ovn-controller/0.log" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991767 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" exitCode=0 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991810 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a" exitCode=0 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991828 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b" exitCode=0 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991845 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf" exitCode=0 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991860 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" 
containerID="dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9" exitCode=0 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991874 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1" exitCode=0 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991886 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7" exitCode=143 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991899 4870 generic.go:334] "Generic (PLEG): container finished" podID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" containerID="7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a" exitCode=143 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.991963 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992003 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992025 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992046 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" 
event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992065 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992090 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992109 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992125 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992136 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992204 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992220 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992232 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992243 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992255 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992267 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992285 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992304 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992317 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992329 4870 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992341 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992352 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992364 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992374 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992385 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992396 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992407 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992424 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992441 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992453 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992464 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992476 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992487 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992500 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992511 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} Mar 12 
00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992523 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992533 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992543 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992558 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" event={"ID":"467385e2-3bbf-4cf0-909a-8e878b5d86dc","Type":"ContainerDied","Data":"eeb75b161617173fc75c006eb224489e05a0f4b000f626bfea176014320c34c1"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992574 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992587 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992600 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992612 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992623 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992634 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992644 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992654 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992665 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992675 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992698 4870 scope.go:117] "RemoveContainer" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.992952 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xwrqb" Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.996034 4870 generic.go:334] "Generic (PLEG): container finished" podID="672aee0b-4d9e-4466-9906-4b4f00ff7f11" containerID="ad97e0e62bc07306b7baf476e6690561e248104090854fa70e0c31238b090544" exitCode=0 Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.996121 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerDied","Data":"ad97e0e62bc07306b7baf476e6690561e248104090854fa70e0c31238b090544"} Mar 12 00:19:27 crc kubenswrapper[4870]: I0312 00:19:27.996220 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"805079d46a2a658d1cf4e583745810c1841cc7f570a9384cd272a1d1349382ac"} Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.001848 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/2.log" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.002818 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/1.log" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.002862 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ad1e98a-cb66-436d-8e5e-301724f70769" containerID="52b5a384822516956958b8eb4a6f0f514a9febbe684f59ae926459ef7203c441" exitCode=2 Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.002894 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerDied","Data":"52b5a384822516956958b8eb4a6f0f514a9febbe684f59ae926459ef7203c441"} Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 
00:19:28.002916 4870 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae"} Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.005319 4870 scope.go:117] "RemoveContainer" containerID="52b5a384822516956958b8eb4a6f0f514a9febbe684f59ae926459ef7203c441" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.005501 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8hngl_openshift-multus(2ad1e98a-cb66-436d-8e5e-301724f70769)\"" pod="openshift-multus/multus-8hngl" podUID="2ad1e98a-cb66-436d-8e5e-301724f70769" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.035507 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.103352 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xwrqb"] Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.104311 4870 scope.go:117] "RemoveContainer" containerID="2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.117403 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xwrqb"] Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.135196 4870 scope.go:117] "RemoveContainer" containerID="4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.152898 4870 scope.go:117] "RemoveContainer" containerID="e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.167172 4870 scope.go:117] "RemoveContainer" 
containerID="dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.189802 4870 scope.go:117] "RemoveContainer" containerID="a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.218460 4870 scope.go:117] "RemoveContainer" containerID="5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.241449 4870 scope.go:117] "RemoveContainer" containerID="7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.292250 4870 scope.go:117] "RemoveContainer" containerID="56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.313600 4870 scope.go:117] "RemoveContainer" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.314277 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": container with ID starting with 71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74 not found: ID does not exist" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.314338 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} err="failed to get container status \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": rpc error: code = NotFound desc = could not find container \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": container with ID starting with 
71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.314407 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.315024 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": container with ID starting with 65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6 not found: ID does not exist" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.315067 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} err="failed to get container status \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": rpc error: code = NotFound desc = could not find container \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": container with ID starting with 65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.315098 4870 scope.go:117] "RemoveContainer" containerID="2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.315583 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": container with ID starting with 2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a not found: ID does not exist" containerID="2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a" Mar 12 00:19:28 crc 
kubenswrapper[4870]: I0312 00:19:28.315654 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} err="failed to get container status \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": rpc error: code = NotFound desc = could not find container \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": container with ID starting with 2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.315703 4870 scope.go:117] "RemoveContainer" containerID="4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.316338 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": container with ID starting with 4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b not found: ID does not exist" containerID="4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.316384 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} err="failed to get container status \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": rpc error: code = NotFound desc = could not find container \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": container with ID starting with 4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.316407 4870 scope.go:117] "RemoveContainer" containerID="e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf" Mar 12 
00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.316774 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": container with ID starting with e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf not found: ID does not exist" containerID="e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.316849 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} err="failed to get container status \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": rpc error: code = NotFound desc = could not find container \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": container with ID starting with e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.316896 4870 scope.go:117] "RemoveContainer" containerID="dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.317791 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": container with ID starting with dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9 not found: ID does not exist" containerID="dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.317835 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} err="failed to get container status 
\"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": rpc error: code = NotFound desc = could not find container \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": container with ID starting with dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.317858 4870 scope.go:117] "RemoveContainer" containerID="a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.318386 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": container with ID starting with a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1 not found: ID does not exist" containerID="a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.318422 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} err="failed to get container status \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": rpc error: code = NotFound desc = could not find container \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": container with ID starting with a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.318442 4870 scope.go:117] "RemoveContainer" containerID="5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.318860 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": container with ID starting with 5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7 not found: ID does not exist" containerID="5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.318929 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} err="failed to get container status \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": rpc error: code = NotFound desc = could not find container \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": container with ID starting with 5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.318970 4870 scope.go:117] "RemoveContainer" containerID="7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.319400 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": container with ID starting with 7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a not found: ID does not exist" containerID="7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.319451 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} err="failed to get container status \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": rpc error: code = NotFound desc = could not find container \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": container with ID 
starting with 7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.319483 4870 scope.go:117] "RemoveContainer" containerID="56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5" Mar 12 00:19:28 crc kubenswrapper[4870]: E0312 00:19:28.319889 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": container with ID starting with 56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5 not found: ID does not exist" containerID="56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.319928 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} err="failed to get container status \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": rpc error: code = NotFound desc = could not find container \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": container with ID starting with 56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.319953 4870 scope.go:117] "RemoveContainer" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.320358 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} err="failed to get container status \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": rpc error: code = NotFound desc = could not find container \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": 
container with ID starting with 71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.320388 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.320747 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} err="failed to get container status \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": rpc error: code = NotFound desc = could not find container \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": container with ID starting with 65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.320776 4870 scope.go:117] "RemoveContainer" containerID="2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.321136 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} err="failed to get container status \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": rpc error: code = NotFound desc = could not find container \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": container with ID starting with 2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.321228 4870 scope.go:117] "RemoveContainer" containerID="4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.321616 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} err="failed to get container status \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": rpc error: code = NotFound desc = could not find container \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": container with ID starting with 4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.321642 4870 scope.go:117] "RemoveContainer" containerID="e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.322031 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} err="failed to get container status \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": rpc error: code = NotFound desc = could not find container \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": container with ID starting with e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.322060 4870 scope.go:117] "RemoveContainer" containerID="dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.322375 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} err="failed to get container status \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": rpc error: code = NotFound desc = could not find container \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": container with ID starting with dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9 not found: ID does not 
exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.322415 4870 scope.go:117] "RemoveContainer" containerID="a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.322754 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} err="failed to get container status \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": rpc error: code = NotFound desc = could not find container \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": container with ID starting with a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.322784 4870 scope.go:117] "RemoveContainer" containerID="5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.323205 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} err="failed to get container status \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": rpc error: code = NotFound desc = could not find container \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": container with ID starting with 5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.323238 4870 scope.go:117] "RemoveContainer" containerID="7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.323538 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} err="failed to get container status 
\"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": rpc error: code = NotFound desc = could not find container \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": container with ID starting with 7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.323562 4870 scope.go:117] "RemoveContainer" containerID="56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.323996 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} err="failed to get container status \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": rpc error: code = NotFound desc = could not find container \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": container with ID starting with 56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.324020 4870 scope.go:117] "RemoveContainer" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.324387 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} err="failed to get container status \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": rpc error: code = NotFound desc = could not find container \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": container with ID starting with 71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.324417 4870 scope.go:117] "RemoveContainer" 
containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.324730 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} err="failed to get container status \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": rpc error: code = NotFound desc = could not find container \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": container with ID starting with 65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.324759 4870 scope.go:117] "RemoveContainer" containerID="2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.325182 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} err="failed to get container status \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": rpc error: code = NotFound desc = could not find container \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": container with ID starting with 2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.325219 4870 scope.go:117] "RemoveContainer" containerID="4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.325489 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} err="failed to get container status \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": rpc error: code = NotFound desc = could 
not find container \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": container with ID starting with 4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.325515 4870 scope.go:117] "RemoveContainer" containerID="e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.325869 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} err="failed to get container status \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": rpc error: code = NotFound desc = could not find container \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": container with ID starting with e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.325890 4870 scope.go:117] "RemoveContainer" containerID="dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.326283 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} err="failed to get container status \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": rpc error: code = NotFound desc = could not find container \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": container with ID starting with dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.326306 4870 scope.go:117] "RemoveContainer" containerID="a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 
00:19:28.326665 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} err="failed to get container status \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": rpc error: code = NotFound desc = could not find container \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": container with ID starting with a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.326697 4870 scope.go:117] "RemoveContainer" containerID="5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.327071 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} err="failed to get container status \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": rpc error: code = NotFound desc = could not find container \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": container with ID starting with 5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.327103 4870 scope.go:117] "RemoveContainer" containerID="7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.327501 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} err="failed to get container status \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": rpc error: code = NotFound desc = could not find container \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": container with ID starting with 
7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.327526 4870 scope.go:117] "RemoveContainer" containerID="56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.327961 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} err="failed to get container status \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": rpc error: code = NotFound desc = could not find container \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": container with ID starting with 56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.328003 4870 scope.go:117] "RemoveContainer" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.328460 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} err="failed to get container status \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": rpc error: code = NotFound desc = could not find container \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": container with ID starting with 71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74 not found: ID does not exist" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.328484 4870 scope.go:117] "RemoveContainer" containerID="65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6" Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.328811 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6"} err="failed to get container status \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": rpc error: code = NotFound desc = could not find container \"65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6\": container with ID starting with 65b777848ca3160e6119d7184d68723050e6320a9ec33736230902f047a557f6 not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.328846 4870 scope.go:117] "RemoveContainer" containerID="2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.329277 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a"} err="failed to get container status \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": rpc error: code = NotFound desc = could not find container \"2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a\": container with ID starting with 2fc4403fa10cae38f724d0e6e2019a2e36c2207c71bc588eaec71fe71d338a7a not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.329320 4870 scope.go:117] "RemoveContainer" containerID="4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.329841 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b"} err="failed to get container status \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": rpc error: code = NotFound desc = could not find container \"4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b\": container with ID starting with 4028769e1210892c2cdb3c27c487a75d712a3f4d06282f97d190d351683cd89b not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.329879 4870 scope.go:117] "RemoveContainer" containerID="e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.330304 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf"} err="failed to get container status \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": rpc error: code = NotFound desc = could not find container \"e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf\": container with ID starting with e7e5b29f5b2c93ea966c4d122138b6ba826ad56e37d2ac74c3667193d6c4fabf not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.330331 4870 scope.go:117] "RemoveContainer" containerID="dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.330685 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9"} err="failed to get container status \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": rpc error: code = NotFound desc = could not find container \"dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9\": container with ID starting with dbf9fef0504ebe0d34c65db5edc30a200ad358307d0cd342fd34972cd35fc6f9 not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.330708 4870 scope.go:117] "RemoveContainer" containerID="a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.331001 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1"} err="failed to get container status \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": rpc error: code = NotFound desc = could not find container \"a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1\": container with ID starting with a733e6656e86aa61faa830625937d73e21f5d204e2f60aba62b01a5760bcd0c1 not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.331031 4870 scope.go:117] "RemoveContainer" containerID="5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.331310 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7"} err="failed to get container status \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": rpc error: code = NotFound desc = could not find container \"5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7\": container with ID starting with 5cd6789ca9e7c3c02ceedc7adb589dcf4119f8e4ad373937c896afa0ff3b04c7 not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.331345 4870 scope.go:117] "RemoveContainer" containerID="7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.331768 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a"} err="failed to get container status \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": rpc error: code = NotFound desc = could not find container \"7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a\": container with ID starting with 7bc3afb3fdbda88fd780a5688591cb9707429b3e403d6525b4241b5a2e16ca4a not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.331794 4870 scope.go:117] "RemoveContainer" containerID="56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.332099 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5"} err="failed to get container status \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": rpc error: code = NotFound desc = could not find container \"56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5\": container with ID starting with 56aa5ec25349c9d684096c807f43d40bfe4012611ed211f40cb356799f3d24e5 not found: ID does not exist"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.332119 4870 scope.go:117] "RemoveContainer" containerID="71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"
Mar 12 00:19:28 crc kubenswrapper[4870]: I0312 00:19:28.332532 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74"} err="failed to get container status \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": rpc error: code = NotFound desc = could not find container \"71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74\": container with ID starting with 71449cd796cca79b6afc65995f00a4cc4cc210800a376b676bda98b458342e74 not found: ID does not exist"
Mar 12 00:19:29 crc kubenswrapper[4870]: I0312 00:19:29.014406 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"a8ccac627da8641ee76713821130b187c8361d0dc5587cde904acc3b50f0f61f"}
Mar 12 00:19:29 crc kubenswrapper[4870]: I0312 00:19:29.014726 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"1a81bf336f16096c810785fc6a4147df27d0d96fe828ea0b9422f3b9bb2821e2"}
Mar 12 00:19:29 crc kubenswrapper[4870]: I0312 00:19:29.014752 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"4d3c5e1fe9c47f73fb19404af55412e975c7783ee7e568c8351e5ac859359e94"}
Mar 12 00:19:29 crc kubenswrapper[4870]: I0312 00:19:29.014771 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"1e8a7a6458dcc1cf4787a81fb3116d7c0b636f2fb0711ffb740aac18a273b06e"}
Mar 12 00:19:29 crc kubenswrapper[4870]: I0312 00:19:29.014791 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"4247e107e13dc19c8e96a96b8d629e99189744a8ea05032bd9a1ecee2554c14f"}
Mar 12 00:19:29 crc kubenswrapper[4870]: I0312 00:19:29.014810 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"828f45031069b984e53bc0128552fba31bd5ae3c075b9ee8eacf07042fff81b6"}
Mar 12 00:19:30 crc kubenswrapper[4870]: I0312 00:19:30.122465 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="467385e2-3bbf-4cf0-909a-8e878b5d86dc" path="/var/lib/kubelet/pods/467385e2-3bbf-4cf0-909a-8e878b5d86dc/volumes"
Mar 12 00:19:30 crc kubenswrapper[4870]: I0312 00:19:30.617351 4870 scope.go:117] "RemoveContainer" containerID="c8c490f8ffe8abc8d1d850c770b06932babeec8791662b90c49dafd04b7c61ae"
Mar 12 00:19:31 crc kubenswrapper[4870]: I0312 00:19:31.030782 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/2.log"
Mar 12 00:19:32 crc kubenswrapper[4870]: I0312 00:19:32.045771 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"e552cedda54019e11df09e95822bfed003a8543636edf3c6050f137592889dce"}
Mar 12 00:19:34 crc kubenswrapper[4870]: I0312 00:19:34.062196 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" event={"ID":"672aee0b-4d9e-4466-9906-4b4f00ff7f11","Type":"ContainerStarted","Data":"a26d90c11236425ce86de5034ce07aca922d6ee7eefdcefb4eb6921ca147451b"}
Mar 12 00:19:34 crc kubenswrapper[4870]: I0312 00:19:34.062627 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp"
Mar 12 00:19:34 crc kubenswrapper[4870]: I0312 00:19:34.062642 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp"
Mar 12 00:19:34 crc kubenswrapper[4870]: I0312 00:19:34.062652 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp"
Mar 12 00:19:34 crc kubenswrapper[4870]: I0312 00:19:34.095939 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp"
Mar 12 00:19:34 crc kubenswrapper[4870]: I0312 00:19:34.098383 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp"
Mar 12 00:19:34 crc kubenswrapper[4870]: I0312 00:19:34.135689 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp" podStartSLOduration=7.135672364 podStartE2EDuration="7.135672364s" podCreationTimestamp="2026-03-12 00:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 00:19:34.090973058 +0000 UTC m=+664.694389388" watchObservedRunningTime="2026-03-12 00:19:34.135672364 +0000 UTC m=+664.739088674"
Mar 12 00:19:41 crc kubenswrapper[4870]: I0312 00:19:41.105572 4870 scope.go:117] "RemoveContainer" containerID="52b5a384822516956958b8eb4a6f0f514a9febbe684f59ae926459ef7203c441"
Mar 12 00:19:41 crc kubenswrapper[4870]: E0312 00:19:41.106426 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8hngl_openshift-multus(2ad1e98a-cb66-436d-8e5e-301724f70769)\"" pod="openshift-multus/multus-8hngl" podUID="2ad1e98a-cb66-436d-8e5e-301724f70769"
Mar 12 00:19:52 crc kubenswrapper[4870]: I0312 00:19:52.104703 4870 scope.go:117] "RemoveContainer" containerID="52b5a384822516956958b8eb4a6f0f514a9febbe684f59ae926459ef7203c441"
Mar 12 00:19:53 crc kubenswrapper[4870]: I0312 00:19:53.197779 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hngl_2ad1e98a-cb66-436d-8e5e-301724f70769/kube-multus/2.log"
Mar 12 00:19:53 crc kubenswrapper[4870]: I0312 00:19:53.197848 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hngl" event={"ID":"2ad1e98a-cb66-436d-8e5e-301724f70769","Type":"ContainerStarted","Data":"83f79c20940daf13137a6e84816ab809db0ef8525b33e256bb51f7ef969010bf"}
Mar 12 00:19:57 crc kubenswrapper[4870]: I0312 00:19:57.683072 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t2jhp"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.153999 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554580-j4sw5"]
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.155308 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554580-j4sw5"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.157769 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.160113 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.162082 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.167974 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554580-j4sw5"]
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.169691 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7pd\" (UniqueName: \"kubernetes.io/projected/2870ede9-9765-4376-a848-1e2721d3f95c-kube-api-access-lw7pd\") pod \"auto-csr-approver-29554580-j4sw5\" (UID: \"2870ede9-9765-4376-a848-1e2721d3f95c\") " pod="openshift-infra/auto-csr-approver-29554580-j4sw5"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.271790 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7pd\" (UniqueName: \"kubernetes.io/projected/2870ede9-9765-4376-a848-1e2721d3f95c-kube-api-access-lw7pd\") pod \"auto-csr-approver-29554580-j4sw5\" (UID: \"2870ede9-9765-4376-a848-1e2721d3f95c\") " pod="openshift-infra/auto-csr-approver-29554580-j4sw5"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.294977 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7pd\" (UniqueName: \"kubernetes.io/projected/2870ede9-9765-4376-a848-1e2721d3f95c-kube-api-access-lw7pd\") pod \"auto-csr-approver-29554580-j4sw5\" (UID: \"2870ede9-9765-4376-a848-1e2721d3f95c\") " pod="openshift-infra/auto-csr-approver-29554580-j4sw5"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.508468 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554580-j4sw5"
Mar 12 00:20:00 crc kubenswrapper[4870]: I0312 00:20:00.770803 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554580-j4sw5"]
Mar 12 00:20:00 crc kubenswrapper[4870]: W0312 00:20:00.779000 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2870ede9_9765_4376_a848_1e2721d3f95c.slice/crio-05b892354d9e5ffc2954e59e3f07e6a6f78efeb34f476691b7d5e71e4a0f7021 WatchSource:0}: Error finding container 05b892354d9e5ffc2954e59e3f07e6a6f78efeb34f476691b7d5e71e4a0f7021: Status 404 returned error can't find the container with id 05b892354d9e5ffc2954e59e3f07e6a6f78efeb34f476691b7d5e71e4a0f7021
Mar 12 00:20:01 crc kubenswrapper[4870]: I0312 00:20:01.252564 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554580-j4sw5" event={"ID":"2870ede9-9765-4376-a848-1e2721d3f95c","Type":"ContainerStarted","Data":"05b892354d9e5ffc2954e59e3f07e6a6f78efeb34f476691b7d5e71e4a0f7021"}
Mar 12 00:20:03 crc kubenswrapper[4870]: I0312 00:20:03.281307 4870 generic.go:334] "Generic (PLEG): container finished" podID="2870ede9-9765-4376-a848-1e2721d3f95c" containerID="6bc7e9a83d050e78e320633876851f16890ac1132c9a1b4162c3e894a9d4be89" exitCode=0
Mar 12 00:20:03 crc kubenswrapper[4870]: I0312 00:20:03.281382 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554580-j4sw5" event={"ID":"2870ede9-9765-4376-a848-1e2721d3f95c","Type":"ContainerDied","Data":"6bc7e9a83d050e78e320633876851f16890ac1132c9a1b4162c3e894a9d4be89"}
Mar 12 00:20:04 crc kubenswrapper[4870]: I0312 00:20:04.557922 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554580-j4sw5"
Mar 12 00:20:04 crc kubenswrapper[4870]: I0312 00:20:04.740320 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw7pd\" (UniqueName: \"kubernetes.io/projected/2870ede9-9765-4376-a848-1e2721d3f95c-kube-api-access-lw7pd\") pod \"2870ede9-9765-4376-a848-1e2721d3f95c\" (UID: \"2870ede9-9765-4376-a848-1e2721d3f95c\") "
Mar 12 00:20:04 crc kubenswrapper[4870]: I0312 00:20:04.745401 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2870ede9-9765-4376-a848-1e2721d3f95c-kube-api-access-lw7pd" (OuterVolumeSpecName: "kube-api-access-lw7pd") pod "2870ede9-9765-4376-a848-1e2721d3f95c" (UID: "2870ede9-9765-4376-a848-1e2721d3f95c"). InnerVolumeSpecName "kube-api-access-lw7pd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:20:04 crc kubenswrapper[4870]: I0312 00:20:04.842485 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw7pd\" (UniqueName: \"kubernetes.io/projected/2870ede9-9765-4376-a848-1e2721d3f95c-kube-api-access-lw7pd\") on node \"crc\" DevicePath \"\""
Mar 12 00:20:05 crc kubenswrapper[4870]: I0312 00:20:05.297413 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554580-j4sw5" event={"ID":"2870ede9-9765-4376-a848-1e2721d3f95c","Type":"ContainerDied","Data":"05b892354d9e5ffc2954e59e3f07e6a6f78efeb34f476691b7d5e71e4a0f7021"}
Mar 12 00:20:05 crc kubenswrapper[4870]: I0312 00:20:05.297472 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b892354d9e5ffc2954e59e3f07e6a6f78efeb34f476691b7d5e71e4a0f7021"
Mar 12 00:20:05 crc kubenswrapper[4870]: I0312 00:20:05.297498 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554580-j4sw5"
Mar 12 00:20:05 crc kubenswrapper[4870]: I0312 00:20:05.635209 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554574-m2sqq"]
Mar 12 00:20:05 crc kubenswrapper[4870]: I0312 00:20:05.642706 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554574-m2sqq"]
Mar 12 00:20:06 crc kubenswrapper[4870]: I0312 00:20:06.115400 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f600df0-7365-49d2-ba00-9747953def68" path="/var/lib/kubelet/pods/8f600df0-7365-49d2-ba00-9747953def68/volumes"
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.353468 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2z2k"]
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.362576 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m2z2k" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="registry-server" containerID="cri-o://6c4b0a5328ece4a44202d6b92ad1e0aa504d8dceaae5d7b9d22d21ee84ab386c" gracePeriod=30
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.495462 4870 generic.go:334] "Generic (PLEG): container finished" podID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerID="6c4b0a5328ece4a44202d6b92ad1e0aa504d8dceaae5d7b9d22d21ee84ab386c" exitCode=0
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.495510 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2z2k" event={"ID":"bafff646-ca93-422c-8f5d-e0f30e852b71","Type":"ContainerDied","Data":"6c4b0a5328ece4a44202d6b92ad1e0aa504d8dceaae5d7b9d22d21ee84ab386c"}
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.746604 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2z2k"
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.869953 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-catalog-content\") pod \"bafff646-ca93-422c-8f5d-e0f30e852b71\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") "
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.870351 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgf4z\" (UniqueName: \"kubernetes.io/projected/bafff646-ca93-422c-8f5d-e0f30e852b71-kube-api-access-rgf4z\") pod \"bafff646-ca93-422c-8f5d-e0f30e852b71\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") "
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.870497 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-utilities\") pod \"bafff646-ca93-422c-8f5d-e0f30e852b71\" (UID: \"bafff646-ca93-422c-8f5d-e0f30e852b71\") "
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.872692 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-utilities" (OuterVolumeSpecName: "utilities") pod "bafff646-ca93-422c-8f5d-e0f30e852b71" (UID: "bafff646-ca93-422c-8f5d-e0f30e852b71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.880322 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafff646-ca93-422c-8f5d-e0f30e852b71-kube-api-access-rgf4z" (OuterVolumeSpecName: "kube-api-access-rgf4z") pod "bafff646-ca93-422c-8f5d-e0f30e852b71" (UID: "bafff646-ca93-422c-8f5d-e0f30e852b71"). InnerVolumeSpecName "kube-api-access-rgf4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.892712 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bafff646-ca93-422c-8f5d-e0f30e852b71" (UID: "bafff646-ca93-422c-8f5d-e0f30e852b71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.972093 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.972190 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafff646-ca93-422c-8f5d-e0f30e852b71-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 00:20:35 crc kubenswrapper[4870]: I0312 00:20:35.972220 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgf4z\" (UniqueName: \"kubernetes.io/projected/bafff646-ca93-422c-8f5d-e0f30e852b71-kube-api-access-rgf4z\") on node \"crc\" DevicePath \"\""
Mar 12 00:20:36 crc kubenswrapper[4870]: I0312 00:20:36.502208 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2z2k" event={"ID":"bafff646-ca93-422c-8f5d-e0f30e852b71","Type":"ContainerDied","Data":"3cf6d4864b6a635ab951e83ded94a818f4c111cb5fcba861f9eff49967b88cbe"}
Mar 12 00:20:36 crc kubenswrapper[4870]: I0312 00:20:36.502276 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2z2k"
Mar 12 00:20:36 crc kubenswrapper[4870]: I0312 00:20:36.502292 4870 scope.go:117] "RemoveContainer" containerID="6c4b0a5328ece4a44202d6b92ad1e0aa504d8dceaae5d7b9d22d21ee84ab386c"
Mar 12 00:20:36 crc kubenswrapper[4870]: I0312 00:20:36.520676 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2z2k"]
Mar 12 00:20:36 crc kubenswrapper[4870]: I0312 00:20:36.523690 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2z2k"]
Mar 12 00:20:36 crc kubenswrapper[4870]: I0312 00:20:36.530933 4870 scope.go:117] "RemoveContainer" containerID="ef4d7369d6e64b07e704a9642a17d77787421d298dc420a5053f8298bf5f9e96"
Mar 12 00:20:36 crc kubenswrapper[4870]: I0312 00:20:36.557044 4870 scope.go:117] "RemoveContainer" containerID="023df00116809cf17a0e9a819b147a660cb3e668f5788ec0e8c987ce1a7a2535"
Mar 12 00:20:38 crc kubenswrapper[4870]: I0312 00:20:38.111722 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" path="/var/lib/kubelet/pods/bafff646-ca93-422c-8f5d-e0f30e852b71/volumes"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.144136 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"]
Mar 12 00:20:39 crc kubenswrapper[4870]: E0312 00:20:39.144382 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="registry-server"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.144396 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="registry-server"
Mar 12 00:20:39 crc kubenswrapper[4870]: E0312 00:20:39.144420 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2870ede9-9765-4376-a848-1e2721d3f95c" containerName="oc"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.144429 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2870ede9-9765-4376-a848-1e2721d3f95c" containerName="oc"
Mar 12 00:20:39 crc kubenswrapper[4870]: E0312 00:20:39.144446 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="extract-utilities"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.144454 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="extract-utilities"
Mar 12 00:20:39 crc kubenswrapper[4870]: E0312 00:20:39.144462 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="extract-content"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.144469 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="extract-content"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.144569 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2870ede9-9765-4376-a848-1e2721d3f95c" containerName="oc"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.144581 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafff646-ca93-422c-8f5d-e0f30e852b71" containerName="registry-server"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.145415 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.147764 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.151472 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"]
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.220804 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.220857 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvppz\" (UniqueName: \"kubernetes.io/projected/c8c66009-c5f6-417d-8250-90e3b514c5ab-kube-api-access-fvppz\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.220893 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.321731 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.321788 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvppz\" (UniqueName: \"kubernetes.io/projected/c8c66009-c5f6-417d-8250-90e3b514c5ab-kube-api-access-fvppz\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.321831 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.322431 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.322474 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.345969 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvppz\" (UniqueName: \"kubernetes.io/projected/c8c66009-c5f6-417d-8250-90e3b514c5ab-kube-api-access-fvppz\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.475065 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:39 crc kubenswrapper[4870]: I0312 00:20:39.727235 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"]
Mar 12 00:20:40 crc kubenswrapper[4870]: I0312 00:20:40.535367 4870 generic.go:334] "Generic (PLEG): container finished" podID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerID="dcd2726f76515402e5fdda4919e020b7f7f57cfd74be862c62c8819ef5359555" exitCode=0
Mar 12 00:20:40 crc kubenswrapper[4870]: I0312 00:20:40.536337 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d" event={"ID":"c8c66009-c5f6-417d-8250-90e3b514c5ab","Type":"ContainerDied","Data":"dcd2726f76515402e5fdda4919e020b7f7f57cfd74be862c62c8819ef5359555"}
Mar 12 00:20:40 crc kubenswrapper[4870]: I0312 00:20:40.536406 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d" event={"ID":"c8c66009-c5f6-417d-8250-90e3b514c5ab","Type":"ContainerStarted","Data":"f21610d088e7f4bf1e97c80daec55e0eac517120af35fd657dc90cb25e061e05"}
Mar 12 00:20:41 crc kubenswrapper[4870]: I0312 00:20:41.542090 4870 generic.go:334] "Generic (PLEG): container finished" podID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerID="d16f45ca27f6efaff4d526d6e73e0a8737843b2b7f660476f87cab4dadea27b4" exitCode=0
Mar 12 00:20:41 crc kubenswrapper[4870]: I0312 00:20:41.542174 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d" event={"ID":"c8c66009-c5f6-417d-8250-90e3b514c5ab","Type":"ContainerDied","Data":"d16f45ca27f6efaff4d526d6e73e0a8737843b2b7f660476f87cab4dadea27b4"}
Mar 12 00:20:42 crc kubenswrapper[4870]: I0312 00:20:42.551565 4870 generic.go:334] "Generic (PLEG): container finished" podID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerID="5cb5e9efa9327963e4ed44f3af5fb299ed885f85ac7210afafcc3f312696de09" exitCode=0
Mar 12 00:20:42 crc kubenswrapper[4870]: I0312 00:20:42.551625 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d" event={"ID":"c8c66009-c5f6-417d-8250-90e3b514c5ab","Type":"ContainerDied","Data":"5cb5e9efa9327963e4ed44f3af5fb299ed885f85ac7210afafcc3f312696de09"}
Mar 12 00:20:43 crc kubenswrapper[4870]: I0312 00:20:43.914403 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d"
Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.080933 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-bundle\") pod \"c8c66009-c5f6-417d-8250-90e3b514c5ab\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") "
Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.081242 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-util\") pod \"c8c66009-c5f6-417d-8250-90e3b514c5ab\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") "
Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.081458 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvppz\" (UniqueName: \"kubernetes.io/projected/c8c66009-c5f6-417d-8250-90e3b514c5ab-kube-api-access-fvppz\") pod \"c8c66009-c5f6-417d-8250-90e3b514c5ab\" (UID: \"c8c66009-c5f6-417d-8250-90e3b514c5ab\") "
Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.085795 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-bundle" (OuterVolumeSpecName: "bundle") pod "c8c66009-c5f6-417d-8250-90e3b514c5ab" (UID: "c8c66009-c5f6-417d-8250-90e3b514c5ab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.086640 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c66009-c5f6-417d-8250-90e3b514c5ab-kube-api-access-fvppz" (OuterVolumeSpecName: "kube-api-access-fvppz") pod "c8c66009-c5f6-417d-8250-90e3b514c5ab" (UID: "c8c66009-c5f6-417d-8250-90e3b514c5ab"). InnerVolumeSpecName "kube-api-access-fvppz".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.111961 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-util" (OuterVolumeSpecName: "util") pod "c8c66009-c5f6-417d-8250-90e3b514c5ab" (UID: "c8c66009-c5f6-417d-8250-90e3b514c5ab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.182707 4870 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.182740 4870 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c66009-c5f6-417d-8250-90e3b514c5ab-util\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.182752 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvppz\" (UniqueName: \"kubernetes.io/projected/c8c66009-c5f6-417d-8250-90e3b514c5ab-kube-api-access-fvppz\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.570362 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d" event={"ID":"c8c66009-c5f6-417d-8250-90e3b514c5ab","Type":"ContainerDied","Data":"f21610d088e7f4bf1e97c80daec55e0eac517120af35fd657dc90cb25e061e05"} Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.570417 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f21610d088e7f4bf1e97c80daec55e0eac517120af35fd657dc90cb25e061e05" Mar 12 00:20:44 crc kubenswrapper[4870]: I0312 00:20:44.570433 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.750379 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp"] Mar 12 00:20:45 crc kubenswrapper[4870]: E0312 00:20:45.750664 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerName="pull" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.750683 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerName="pull" Mar 12 00:20:45 crc kubenswrapper[4870]: E0312 00:20:45.750703 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerName="util" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.750712 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerName="util" Mar 12 00:20:45 crc kubenswrapper[4870]: E0312 00:20:45.750726 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerName="extract" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.750733 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerName="extract" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.750843 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c66009-c5f6-417d-8250-90e3b514c5ab" containerName="extract" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.751707 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.756579 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.763809 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp"] Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.814961 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.815053 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jrnz\" (UniqueName: \"kubernetes.io/projected/812a958c-4d49-4c60-b2c8-34702f4ec92f-kube-api-access-8jrnz\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.815251 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: 
I0312 00:20:45.916116 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.916209 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.916240 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jrnz\" (UniqueName: \"kubernetes.io/projected/812a958c-4d49-4c60-b2c8-34702f4ec92f-kube-api-access-8jrnz\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.917092 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.917112 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:45 crc kubenswrapper[4870]: I0312 00:20:45.935716 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jrnz\" (UniqueName: \"kubernetes.io/projected/812a958c-4d49-4c60-b2c8-34702f4ec92f-kube-api-access-8jrnz\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:46 crc kubenswrapper[4870]: I0312 00:20:46.070868 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:46 crc kubenswrapper[4870]: I0312 00:20:46.323509 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp"] Mar 12 00:20:46 crc kubenswrapper[4870]: I0312 00:20:46.585019 4870 generic.go:334] "Generic (PLEG): container finished" podID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerID="804ee0e59fd4732f8396c3ea2e54e1b77586b5cd0f9373bc1687825350cea2f1" exitCode=0 Mar 12 00:20:46 crc kubenswrapper[4870]: I0312 00:20:46.585125 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" event={"ID":"812a958c-4d49-4c60-b2c8-34702f4ec92f","Type":"ContainerDied","Data":"804ee0e59fd4732f8396c3ea2e54e1b77586b5cd0f9373bc1687825350cea2f1"} Mar 12 00:20:46 crc kubenswrapper[4870]: I0312 00:20:46.585524 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" event={"ID":"812a958c-4d49-4c60-b2c8-34702f4ec92f","Type":"ContainerStarted","Data":"e599f1eff755ad7ac0438b49f42aa242c57991c3a504f396287dae89d77d44c3"} Mar 12 00:20:47 crc kubenswrapper[4870]: I0312 00:20:47.595044 4870 generic.go:334] "Generic (PLEG): container finished" podID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerID="c6665578475c5790f714998f9a7922e74a67fdb70107cdef715be3a0be43f414" exitCode=0 Mar 12 00:20:47 crc kubenswrapper[4870]: I0312 00:20:47.595309 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" event={"ID":"812a958c-4d49-4c60-b2c8-34702f4ec92f","Type":"ContainerDied","Data":"c6665578475c5790f714998f9a7922e74a67fdb70107cdef715be3a0be43f414"} Mar 12 00:20:48 crc kubenswrapper[4870]: I0312 00:20:48.604098 4870 generic.go:334] "Generic (PLEG): container finished" podID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerID="64cf418528e5daa23fef7d8dfc7ac1bebee079953148466b07840e4222f55191" exitCode=0 Mar 12 00:20:48 crc kubenswrapper[4870]: I0312 00:20:48.604271 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" event={"ID":"812a958c-4d49-4c60-b2c8-34702f4ec92f","Type":"ContainerDied","Data":"64cf418528e5daa23fef7d8dfc7ac1bebee079953148466b07840e4222f55191"} Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.857823 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.869223 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-bundle\") pod \"812a958c-4d49-4c60-b2c8-34702f4ec92f\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.869263 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jrnz\" (UniqueName: \"kubernetes.io/projected/812a958c-4d49-4c60-b2c8-34702f4ec92f-kube-api-access-8jrnz\") pod \"812a958c-4d49-4c60-b2c8-34702f4ec92f\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.869320 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-util\") pod \"812a958c-4d49-4c60-b2c8-34702f4ec92f\" (UID: \"812a958c-4d49-4c60-b2c8-34702f4ec92f\") " Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.870063 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-bundle" (OuterVolumeSpecName: "bundle") pod "812a958c-4d49-4c60-b2c8-34702f4ec92f" (UID: "812a958c-4d49-4c60-b2c8-34702f4ec92f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.876510 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812a958c-4d49-4c60-b2c8-34702f4ec92f-kube-api-access-8jrnz" (OuterVolumeSpecName: "kube-api-access-8jrnz") pod "812a958c-4d49-4c60-b2c8-34702f4ec92f" (UID: "812a958c-4d49-4c60-b2c8-34702f4ec92f"). InnerVolumeSpecName "kube-api-access-8jrnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.893018 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-util" (OuterVolumeSpecName: "util") pod "812a958c-4d49-4c60-b2c8-34702f4ec92f" (UID: "812a958c-4d49-4c60-b2c8-34702f4ec92f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.970555 4870 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.970596 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jrnz\" (UniqueName: \"kubernetes.io/projected/812a958c-4d49-4c60-b2c8-34702f4ec92f-kube-api-access-8jrnz\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:49 crc kubenswrapper[4870]: I0312 00:20:49.970612 4870 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/812a958c-4d49-4c60-b2c8-34702f4ec92f-util\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.123396 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l"] Mar 12 00:20:50 crc kubenswrapper[4870]: E0312 00:20:50.123643 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerName="util" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.123662 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerName="util" Mar 12 00:20:50 crc kubenswrapper[4870]: E0312 00:20:50.123683 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerName="pull" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.123691 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerName="pull" Mar 12 00:20:50 crc kubenswrapper[4870]: E0312 00:20:50.123705 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerName="extract" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.123713 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerName="extract" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.123838 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="812a958c-4d49-4c60-b2c8-34702f4ec92f" containerName="extract" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.124737 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.138052 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l"] Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.172001 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.172072 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7xdx\" (UniqueName: 
\"kubernetes.io/projected/6e7a1769-d2e9-4230-9964-8c2b0162a247-kube-api-access-x7xdx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.172131 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.273759 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.273851 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7xdx\" (UniqueName: \"kubernetes.io/projected/6e7a1769-d2e9-4230-9964-8c2b0162a247-kube-api-access-x7xdx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.273911 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-util\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.274406 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.274677 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.319598 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7xdx\" (UniqueName: \"kubernetes.io/projected/6e7a1769-d2e9-4230-9964-8c2b0162a247-kube-api-access-x7xdx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.437486 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.621724 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" event={"ID":"812a958c-4d49-4c60-b2c8-34702f4ec92f","Type":"ContainerDied","Data":"e599f1eff755ad7ac0438b49f42aa242c57991c3a504f396287dae89d77d44c3"} Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.621775 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e599f1eff755ad7ac0438b49f42aa242c57991c3a504f396287dae89d77d44c3" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.621791 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp" Mar 12 00:20:50 crc kubenswrapper[4870]: I0312 00:20:50.668597 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l"] Mar 12 00:20:51 crc kubenswrapper[4870]: I0312 00:20:51.628072 4870 generic.go:334] "Generic (PLEG): container finished" podID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerID="fd735470b85b560911fba78ad960fa932b39e28677eca6b83578307fff641389" exitCode=0 Mar 12 00:20:51 crc kubenswrapper[4870]: I0312 00:20:51.628167 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" event={"ID":"6e7a1769-d2e9-4230-9964-8c2b0162a247","Type":"ContainerDied","Data":"fd735470b85b560911fba78ad960fa932b39e28677eca6b83578307fff641389"} Mar 12 00:20:51 crc kubenswrapper[4870]: I0312 00:20:51.628378 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" 
event={"ID":"6e7a1769-d2e9-4230-9964-8c2b0162a247","Type":"ContainerStarted","Data":"234cfc6620c6df413385cc0945eace962c7d9c32f9cbe2dbef32d249ab0a3c2a"} Mar 12 00:20:55 crc kubenswrapper[4870]: I0312 00:20:55.648397 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" event={"ID":"6e7a1769-d2e9-4230-9964-8c2b0162a247","Type":"ContainerStarted","Data":"c0c78d986946e1cb7aede75f0507b18fbb6e16fe69bb7ada9b76a314c9e267ab"} Mar 12 00:20:56 crc kubenswrapper[4870]: I0312 00:20:56.655401 4870 generic.go:334] "Generic (PLEG): container finished" podID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerID="c0c78d986946e1cb7aede75f0507b18fbb6e16fe69bb7ada9b76a314c9e267ab" exitCode=0 Mar 12 00:20:56 crc kubenswrapper[4870]: I0312 00:20:56.655464 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" event={"ID":"6e7a1769-d2e9-4230-9964-8c2b0162a247","Type":"ContainerDied","Data":"c0c78d986946e1cb7aede75f0507b18fbb6e16fe69bb7ada9b76a314c9e267ab"} Mar 12 00:20:57 crc kubenswrapper[4870]: I0312 00:20:57.665109 4870 generic.go:334] "Generic (PLEG): container finished" podID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerID="759ae9445ac25cf08cdde9f66b1ff790fe512406c60d02aea67751d4e39214c4" exitCode=0 Mar 12 00:20:57 crc kubenswrapper[4870]: I0312 00:20:57.665461 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" event={"ID":"6e7a1769-d2e9-4230-9964-8c2b0162a247","Type":"ContainerDied","Data":"759ae9445ac25cf08cdde9f66b1ff790fe512406c60d02aea67751d4e39214c4"} Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.489814 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.490449 4870 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.492107 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zznf7" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.492694 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.494855 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.536892 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.615067 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.615916 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.618234 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.618415 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-p8wdl" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.623826 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.624628 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.643848 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.664360 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f44c8e76-1dd9-45f3-a81b-187476812817-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j\" (UID: \"f44c8e76-1dd9-45f3-a81b-187476812817\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.664414 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9b62e7c-69f6-4997-bdb4-f342b7ac04eb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z\" (UID: 
\"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.664464 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9b62e7c-69f6-4997-bdb4-f342b7ac04eb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z\" (UID: \"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.664483 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f44c8e76-1dd9-45f3-a81b-187476812817-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j\" (UID: \"f44c8e76-1dd9-45f3-a81b-187476812817\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.664510 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9cfj\" (UniqueName: \"kubernetes.io/projected/8ac96e15-a84c-4d36-a106-a5a786ecc075-kube-api-access-g9cfj\") pod \"obo-prometheus-operator-68bc856cb9-zttt2\" (UID: \"8ac96e15-a84c-4d36-a106-a5a786ecc075\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.685971 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.765095 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9b62e7c-69f6-4997-bdb4-f342b7ac04eb-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z\" (UID: \"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.765440 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f44c8e76-1dd9-45f3-a81b-187476812817-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j\" (UID: \"f44c8e76-1dd9-45f3-a81b-187476812817\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.765472 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9cfj\" (UniqueName: \"kubernetes.io/projected/8ac96e15-a84c-4d36-a106-a5a786ecc075-kube-api-access-g9cfj\") pod \"obo-prometheus-operator-68bc856cb9-zttt2\" (UID: \"8ac96e15-a84c-4d36-a106-a5a786ecc075\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.765506 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f44c8e76-1dd9-45f3-a81b-187476812817-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j\" (UID: \"f44c8e76-1dd9-45f3-a81b-187476812817\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.765524 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9b62e7c-69f6-4997-bdb4-f342b7ac04eb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z\" (UID: \"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" 
Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.773984 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9b62e7c-69f6-4997-bdb4-f342b7ac04eb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z\" (UID: \"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.777712 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f44c8e76-1dd9-45f3-a81b-187476812817-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j\" (UID: \"f44c8e76-1dd9-45f3-a81b-187476812817\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.790866 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9b62e7c-69f6-4997-bdb4-f342b7ac04eb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z\" (UID: \"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.791317 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f44c8e76-1dd9-45f3-a81b-187476812817-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j\" (UID: \"f44c8e76-1dd9-45f3-a81b-187476812817\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.791645 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9cfj\" (UniqueName: 
\"kubernetes.io/projected/8ac96e15-a84c-4d36-a106-a5a786ecc075-kube-api-access-g9cfj\") pod \"obo-prometheus-operator-68bc856cb9-zttt2\" (UID: \"8ac96e15-a84c-4d36-a106-a5a786ecc075\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.807415 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.855485 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-vnh6n"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.856076 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.859694 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-x7snj" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.859920 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.871470 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-vnh6n"] Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.939462 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.946263 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.967458 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/a807a2a6-682f-4d87-816a-fe0a8ca96410-kube-api-access-4vpnd\") pod \"observability-operator-59bdc8b94-vnh6n\" (UID: \"a807a2a6-682f-4d87-816a-fe0a8ca96410\") " pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:58 crc kubenswrapper[4870]: I0312 00:20:58.967498 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a807a2a6-682f-4d87-816a-fe0a8ca96410-observability-operator-tls\") pod \"observability-operator-59bdc8b94-vnh6n\" (UID: \"a807a2a6-682f-4d87-816a-fe0a8ca96410\") " pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.019344 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tdzmm"] Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.020577 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.023372 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-hxn4v" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.034412 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tdzmm"] Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.049595 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.068095 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/a807a2a6-682f-4d87-816a-fe0a8ca96410-kube-api-access-4vpnd\") pod \"observability-operator-59bdc8b94-vnh6n\" (UID: \"a807a2a6-682f-4d87-816a-fe0a8ca96410\") " pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.068157 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a807a2a6-682f-4d87-816a-fe0a8ca96410-observability-operator-tls\") pod \"observability-operator-59bdc8b94-vnh6n\" (UID: \"a807a2a6-682f-4d87-816a-fe0a8ca96410\") " pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.072905 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a807a2a6-682f-4d87-816a-fe0a8ca96410-observability-operator-tls\") pod \"observability-operator-59bdc8b94-vnh6n\" (UID: \"a807a2a6-682f-4d87-816a-fe0a8ca96410\") " pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.093673 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/a807a2a6-682f-4d87-816a-fe0a8ca96410-kube-api-access-4vpnd\") pod \"observability-operator-59bdc8b94-vnh6n\" (UID: \"a807a2a6-682f-4d87-816a-fe0a8ca96410\") " pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.169966 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x7xdx\" (UniqueName: \"kubernetes.io/projected/6e7a1769-d2e9-4230-9964-8c2b0162a247-kube-api-access-x7xdx\") pod \"6e7a1769-d2e9-4230-9964-8c2b0162a247\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.170029 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-util\") pod \"6e7a1769-d2e9-4230-9964-8c2b0162a247\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.170089 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-bundle\") pod \"6e7a1769-d2e9-4230-9964-8c2b0162a247\" (UID: \"6e7a1769-d2e9-4230-9964-8c2b0162a247\") " Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.170383 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5grr\" (UniqueName: \"kubernetes.io/projected/8a679de6-3b5b-4ac3-a070-201c270ec629-kube-api-access-j5grr\") pod \"perses-operator-5bf474d74f-tdzmm\" (UID: \"8a679de6-3b5b-4ac3-a070-201c270ec629\") " pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.170423 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a679de6-3b5b-4ac3-a070-201c270ec629-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tdzmm\" (UID: \"8a679de6-3b5b-4ac3-a070-201c270ec629\") " pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.173676 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7a1769-d2e9-4230-9964-8c2b0162a247-kube-api-access-x7xdx" 
(OuterVolumeSpecName: "kube-api-access-x7xdx") pod "6e7a1769-d2e9-4230-9964-8c2b0162a247" (UID: "6e7a1769-d2e9-4230-9964-8c2b0162a247"). InnerVolumeSpecName "kube-api-access-x7xdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.178128 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-bundle" (OuterVolumeSpecName: "bundle") pod "6e7a1769-d2e9-4230-9964-8c2b0162a247" (UID: "6e7a1769-d2e9-4230-9964-8c2b0162a247"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.196578 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.198398 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-util" (OuterVolumeSpecName: "util") pod "6e7a1769-d2e9-4230-9964-8c2b0162a247" (UID: "6e7a1769-d2e9-4230-9964-8c2b0162a247"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.242231 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j"] Mar 12 00:20:59 crc kubenswrapper[4870]: W0312 00:20:59.266790 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf44c8e76_1dd9_45f3_a81b_187476812817.slice/crio-63a1a0c9e581f13bbd4f8f20ea22ffd8472c943de5ad127a68042bf61d26edd2 WatchSource:0}: Error finding container 63a1a0c9e581f13bbd4f8f20ea22ffd8472c943de5ad127a68042bf61d26edd2: Status 404 returned error can't find the container with id 63a1a0c9e581f13bbd4f8f20ea22ffd8472c943de5ad127a68042bf61d26edd2 Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.271711 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5grr\" (UniqueName: \"kubernetes.io/projected/8a679de6-3b5b-4ac3-a070-201c270ec629-kube-api-access-j5grr\") pod \"perses-operator-5bf474d74f-tdzmm\" (UID: \"8a679de6-3b5b-4ac3-a070-201c270ec629\") " pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.271764 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a679de6-3b5b-4ac3-a070-201c270ec629-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tdzmm\" (UID: \"8a679de6-3b5b-4ac3-a070-201c270ec629\") " pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.271823 4870 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.271833 4870 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x7xdx\" (UniqueName: \"kubernetes.io/projected/6e7a1769-d2e9-4230-9964-8c2b0162a247-kube-api-access-x7xdx\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.271842 4870 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e7a1769-d2e9-4230-9964-8c2b0162a247-util\") on node \"crc\" DevicePath \"\"" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.272574 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a679de6-3b5b-4ac3-a070-201c270ec629-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tdzmm\" (UID: \"8a679de6-3b5b-4ac3-a070-201c270ec629\") " pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.295328 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5grr\" (UniqueName: \"kubernetes.io/projected/8a679de6-3b5b-4ac3-a070-201c270ec629-kube-api-access-j5grr\") pod \"perses-operator-5bf474d74f-tdzmm\" (UID: \"8a679de6-3b5b-4ac3-a070-201c270ec629\") " pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.309677 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2"] Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.314635 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z"] Mar 12 00:20:59 crc kubenswrapper[4870]: W0312 00:20:59.325943 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b62e7c_69f6_4997_bdb4_f342b7ac04eb.slice/crio-0a5d01717fc234ad4ca0f6ef7320002666757421abf6cffb361821a921a02657 WatchSource:0}: Error finding container 
0a5d01717fc234ad4ca0f6ef7320002666757421abf6cffb361821a921a02657: Status 404 returned error can't find the container with id 0a5d01717fc234ad4ca0f6ef7320002666757421abf6cffb361821a921a02657 Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.349840 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.637055 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tdzmm"] Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.695288 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" event={"ID":"8a679de6-3b5b-4ac3-a070-201c270ec629","Type":"ContainerStarted","Data":"3096088e122645991970b17bc4d224912a4957b48d00e7d044333102297c4acf"} Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.714628 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-vnh6n"] Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.717175 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" event={"ID":"8ac96e15-a84c-4d36-a106-a5a786ecc075","Type":"ContainerStarted","Data":"1dbef724ff00f872ce12771d9385bcd002342caeac03d2016afcf5c98df3b414"} Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.737718 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.737886 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l" event={"ID":"6e7a1769-d2e9-4230-9964-8c2b0162a247","Type":"ContainerDied","Data":"234cfc6620c6df413385cc0945eace962c7d9c32f9cbe2dbef32d249ab0a3c2a"} Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.737927 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234cfc6620c6df413385cc0945eace962c7d9c32f9cbe2dbef32d249ab0a3c2a" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.746575 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" event={"ID":"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb","Type":"ContainerStarted","Data":"0a5d01717fc234ad4ca0f6ef7320002666757421abf6cffb361821a921a02657"} Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.748123 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" event={"ID":"f44c8e76-1dd9-45f3-a81b-187476812817","Type":"ContainerStarted","Data":"63a1a0c9e581f13bbd4f8f20ea22ffd8472c943de5ad127a68042bf61d26edd2"} Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.774765 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-6db4dfbc56-m8wr9"] Mar 12 00:20:59 crc kubenswrapper[4870]: E0312 00:20:59.774962 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerName="util" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.774977 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerName="util" Mar 12 00:20:59 crc kubenswrapper[4870]: E0312 
00:20:59.774986 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerName="extract" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.774993 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerName="extract" Mar 12 00:20:59 crc kubenswrapper[4870]: E0312 00:20:59.775002 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerName="pull" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.775008 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerName="pull" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.775101 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7a1769-d2e9-4230-9964-8c2b0162a247" containerName="extract" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.775522 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.779705 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.779961 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.780095 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-t4rql" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.780228 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.793407 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-6db4dfbc56-m8wr9"] Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.882820 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10f09f34-2be7-42d5-9efd-f47f69c63519-webhook-cert\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.882890 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trjzx\" (UniqueName: \"kubernetes.io/projected/10f09f34-2be7-42d5-9efd-f47f69c63519-kube-api-access-trjzx\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.882947 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10f09f34-2be7-42d5-9efd-f47f69c63519-apiservice-cert\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.984833 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10f09f34-2be7-42d5-9efd-f47f69c63519-apiservice-cert\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.984932 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10f09f34-2be7-42d5-9efd-f47f69c63519-webhook-cert\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.984981 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trjzx\" (UniqueName: \"kubernetes.io/projected/10f09f34-2be7-42d5-9efd-f47f69c63519-kube-api-access-trjzx\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.993130 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10f09f34-2be7-42d5-9efd-f47f69c63519-webhook-cert\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.993586 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10f09f34-2be7-42d5-9efd-f47f69c63519-apiservice-cert\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:20:59 crc kubenswrapper[4870]: I0312 00:20:59.999766 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trjzx\" (UniqueName: \"kubernetes.io/projected/10f09f34-2be7-42d5-9efd-f47f69c63519-kube-api-access-trjzx\") pod \"elastic-operator-6db4dfbc56-m8wr9\" (UID: \"10f09f34-2be7-42d5-9efd-f47f69c63519\") " pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:21:00 crc kubenswrapper[4870]: I0312 00:21:00.134377 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" Mar 12 00:21:00 crc kubenswrapper[4870]: I0312 00:21:00.356543 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-6db4dfbc56-m8wr9"] Mar 12 00:21:00 crc kubenswrapper[4870]: W0312 00:21:00.360201 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f09f34_2be7_42d5_9efd_f47f69c63519.slice/crio-1881921f9b3d22800057e38c14b473561960cf80436c7013bfbef801b45aff4e WatchSource:0}: Error finding container 1881921f9b3d22800057e38c14b473561960cf80436c7013bfbef801b45aff4e: Status 404 returned error can't find the container with id 1881921f9b3d22800057e38c14b473561960cf80436c7013bfbef801b45aff4e Mar 12 00:21:00 crc kubenswrapper[4870]: I0312 00:21:00.755922 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" event={"ID":"10f09f34-2be7-42d5-9efd-f47f69c63519","Type":"ContainerStarted","Data":"1881921f9b3d22800057e38c14b473561960cf80436c7013bfbef801b45aff4e"} Mar 12 00:21:00 crc kubenswrapper[4870]: 
I0312 00:21:00.756828 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" event={"ID":"a807a2a6-682f-4d87-816a-fe0a8ca96410","Type":"ContainerStarted","Data":"f338289cbd0591d2f2e284e1ae6c2abb5a740f92e36dd1d3412240a4d63575e4"} Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.823116 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" event={"ID":"b9b62e7c-69f6-4997-bdb4-f342b7ac04eb","Type":"ContainerStarted","Data":"4d133b9d71664ede3ff8130d62ec08acb9429acee5e2b9dc6371df0273dc1641"} Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.827255 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" event={"ID":"f44c8e76-1dd9-45f3-a81b-187476812817","Type":"ContainerStarted","Data":"d06a4d65fc4c4b42143676c3d9d085d572122c338bfc6ae435be8c37391d8f7a"} Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.831014 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" event={"ID":"10f09f34-2be7-42d5-9efd-f47f69c63519","Type":"ContainerStarted","Data":"a0e37e1929c92d1009318638e85430fcb7d2ba09e74938c7061d9f36251da0f5"} Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.836760 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" event={"ID":"a807a2a6-682f-4d87-816a-fe0a8ca96410","Type":"ContainerStarted","Data":"1bb787effb289043a689bd23f84e653100f7a82c07be3f52454531224ba2e1f9"} Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.836942 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.837984 4870 patch_prober.go:28] interesting 
pod/observability-operator-59bdc8b94-vnh6n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.41:8081/healthz\": dial tcp 10.217.0.41:8081: connect: connection refused" start-of-body= Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.838029 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" podUID="a807a2a6-682f-4d87-816a-fe0a8ca96410" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.41:8081/healthz\": dial tcp 10.217.0.41:8081: connect: connection refused" Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.839402 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" event={"ID":"8a679de6-3b5b-4ac3-a070-201c270ec629","Type":"ContainerStarted","Data":"d28213b063acccbd39087a429ef5d9991f2807bb02b4cee9026ad7f6ad253c2c"} Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.839538 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.844544 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z" podStartSLOduration=1.626850723 podStartE2EDuration="12.844526273s" podCreationTimestamp="2026-03-12 00:20:58 +0000 UTC" firstStartedPulling="2026-03-12 00:20:59.327772645 +0000 UTC m=+749.931188955" lastFinishedPulling="2026-03-12 00:21:10.545448185 +0000 UTC m=+761.148864505" observedRunningTime="2026-03-12 00:21:10.840488343 +0000 UTC m=+761.443904653" watchObservedRunningTime="2026-03-12 00:21:10.844526273 +0000 UTC m=+761.447942583" Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.865107 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/elastic-operator-6db4dfbc56-m8wr9" podStartSLOduration=1.769344946 podStartE2EDuration="11.865093425s" podCreationTimestamp="2026-03-12 00:20:59 +0000 UTC" firstStartedPulling="2026-03-12 00:21:00.363586427 +0000 UTC m=+750.967002737" lastFinishedPulling="2026-03-12 00:21:10.459334906 +0000 UTC m=+761.062751216" observedRunningTime="2026-03-12 00:21:10.864768165 +0000 UTC m=+761.468184475" watchObservedRunningTime="2026-03-12 00:21:10.865093425 +0000 UTC m=+761.468509735" Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.894720 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j" podStartSLOduration=1.6325048720000002 podStartE2EDuration="12.894706435s" podCreationTimestamp="2026-03-12 00:20:58 +0000 UTC" firstStartedPulling="2026-03-12 00:20:59.274833442 +0000 UTC m=+749.878249752" lastFinishedPulling="2026-03-12 00:21:10.537035005 +0000 UTC m=+761.140451315" observedRunningTime="2026-03-12 00:21:10.890301164 +0000 UTC m=+761.493717474" watchObservedRunningTime="2026-03-12 00:21:10.894706435 +0000 UTC m=+761.498122745" Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.921364 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" podStartSLOduration=2.088410029 podStartE2EDuration="12.921348616s" podCreationTimestamp="2026-03-12 00:20:58 +0000 UTC" firstStartedPulling="2026-03-12 00:20:59.742283643 +0000 UTC m=+750.345699953" lastFinishedPulling="2026-03-12 00:21:10.57522224 +0000 UTC m=+761.178638540" observedRunningTime="2026-03-12 00:21:10.918390848 +0000 UTC m=+761.521807158" watchObservedRunningTime="2026-03-12 00:21:10.921348616 +0000 UTC m=+761.524764926" Mar 12 00:21:10 crc kubenswrapper[4870]: I0312 00:21:10.942750 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" podStartSLOduration=1.077602768 podStartE2EDuration="11.942733272s" podCreationTimestamp="2026-03-12 00:20:59 +0000 UTC" firstStartedPulling="2026-03-12 00:20:59.663680707 +0000 UTC m=+750.267097017" lastFinishedPulling="2026-03-12 00:21:10.528811211 +0000 UTC m=+761.132227521" observedRunningTime="2026-03-12 00:21:10.93662853 +0000 UTC m=+761.540044840" watchObservedRunningTime="2026-03-12 00:21:10.942733272 +0000 UTC m=+761.546149582" Mar 12 00:21:11 crc kubenswrapper[4870]: I0312 00:21:11.846489 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" event={"ID":"8ac96e15-a84c-4d36-a106-a5a786ecc075","Type":"ContainerStarted","Data":"5e1df6f033b15e27988bb9d8910c15d8ea069daba900f837f035bb57fdf5c5e3"} Mar 12 00:21:11 crc kubenswrapper[4870]: I0312 00:21:11.848789 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-vnh6n" Mar 12 00:21:11 crc kubenswrapper[4870]: I0312 00:21:11.863431 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zttt2" podStartSLOduration=2.661116519 podStartE2EDuration="13.863408462s" podCreationTimestamp="2026-03-12 00:20:58 +0000 UTC" firstStartedPulling="2026-03-12 00:20:59.354688135 +0000 UTC m=+749.958104445" lastFinishedPulling="2026-03-12 00:21:10.556980068 +0000 UTC m=+761.160396388" observedRunningTime="2026-03-12 00:21:11.86300886 +0000 UTC m=+762.466425180" watchObservedRunningTime="2026-03-12 00:21:11.863408462 +0000 UTC m=+762.466824782" Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.735325 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8"] Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.736025 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.739359 4870 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-8w9qp" Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.739454 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.739712 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.747538 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8"] Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.912775 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74f6198b-439b-44c2-a03a-37e7afbafd7d-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-hhqw8\" (UID: \"74f6198b-439b-44c2-a03a-37e7afbafd7d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:13 crc kubenswrapper[4870]: I0312 00:21:13.912822 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qws7\" (UniqueName: \"kubernetes.io/projected/74f6198b-439b-44c2-a03a-37e7afbafd7d-kube-api-access-7qws7\") pod \"cert-manager-operator-controller-manager-5586865c96-hhqw8\" (UID: \"74f6198b-439b-44c2-a03a-37e7afbafd7d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:14 crc kubenswrapper[4870]: I0312 00:21:14.014337 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/74f6198b-439b-44c2-a03a-37e7afbafd7d-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-hhqw8\" (UID: \"74f6198b-439b-44c2-a03a-37e7afbafd7d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:14 crc kubenswrapper[4870]: I0312 00:21:14.014395 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qws7\" (UniqueName: \"kubernetes.io/projected/74f6198b-439b-44c2-a03a-37e7afbafd7d-kube-api-access-7qws7\") pod \"cert-manager-operator-controller-manager-5586865c96-hhqw8\" (UID: \"74f6198b-439b-44c2-a03a-37e7afbafd7d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:14 crc kubenswrapper[4870]: I0312 00:21:14.014793 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74f6198b-439b-44c2-a03a-37e7afbafd7d-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-hhqw8\" (UID: \"74f6198b-439b-44c2-a03a-37e7afbafd7d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:14 crc kubenswrapper[4870]: I0312 00:21:14.059186 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qws7\" (UniqueName: \"kubernetes.io/projected/74f6198b-439b-44c2-a03a-37e7afbafd7d-kube-api-access-7qws7\") pod \"cert-manager-operator-controller-manager-5586865c96-hhqw8\" (UID: \"74f6198b-439b-44c2-a03a-37e7afbafd7d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:14 crc kubenswrapper[4870]: I0312 00:21:14.349595 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" Mar 12 00:21:14 crc kubenswrapper[4870]: I0312 00:21:14.948626 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8"] Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.140465 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.141791 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.144524 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.144627 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-btnd7" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.145039 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.145045 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.145218 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.145513 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.145684 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 12 00:21:15 crc 
kubenswrapper[4870]: I0312 00:21:15.145818 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.146571 4870 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.164261 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337556 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337630 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337665 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337693 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337833 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337892 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337923 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337952 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-config-local\") pod 
\"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.337996 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.338056 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.338092 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/3e620875-83c0-4493-83df-9301659b5148-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.338119 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.338214 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.338284 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.338315 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439104 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: 
\"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439165 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/3e620875-83c0-4493-83df-9301659b5148-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439183 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439211 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439237 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439255 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: 
\"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439273 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439302 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439319 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439341 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 
00:21:15.439363 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439382 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439399 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439415 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439676 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439733 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439754 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439819 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439912 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.439958 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/3e620875-83c0-4493-83df-9301659b5148-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.440444 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.440984 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.445347 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/3e620875-83c0-4493-83df-9301659b5148-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.446778 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.446817 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.446855 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.446852 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.450051 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.459758 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/3e620875-83c0-4493-83df-9301659b5148-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"3e620875-83c0-4493-83df-9301659b5148\") " pod="service-telemetry/elasticsearch-es-default-0" 
Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.759887 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.870133 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" event={"ID":"74f6198b-439b-44c2-a03a-37e7afbafd7d","Type":"ContainerStarted","Data":"abf6be1254d592a7df3102d6e7ecb0bb3c993c8fcf17f5b8f654095ed76f3f87"} Mar 12 00:21:15 crc kubenswrapper[4870]: I0312 00:21:15.968185 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 12 00:21:16 crc kubenswrapper[4870]: I0312 00:21:16.883216 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"3e620875-83c0-4493-83df-9301659b5148","Type":"ContainerStarted","Data":"0afaff47ba533ef7949aea3ed473c5b58d7d1c359ef33bb15ffcf485a7581c61"} Mar 12 00:21:17 crc kubenswrapper[4870]: I0312 00:21:17.594676 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:21:17 crc kubenswrapper[4870]: I0312 00:21:17.594732 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:21:19 crc kubenswrapper[4870]: I0312 00:21:19.354159 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-tdzmm" Mar 12 00:21:22 crc 
kubenswrapper[4870]: I0312 00:21:22.925565 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" event={"ID":"74f6198b-439b-44c2-a03a-37e7afbafd7d","Type":"ContainerStarted","Data":"4a8636c4bf97a4edf28bf979a2bf45e3d694a1dce01640d49a279a981108024b"} Mar 12 00:21:22 crc kubenswrapper[4870]: I0312 00:21:22.954225 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-hhqw8" podStartSLOduration=2.396443356 podStartE2EDuration="9.954209716s" podCreationTimestamp="2026-03-12 00:21:13 +0000 UTC" firstStartedPulling="2026-03-12 00:21:14.960681276 +0000 UTC m=+765.564097586" lastFinishedPulling="2026-03-12 00:21:22.518447636 +0000 UTC m=+773.121863946" observedRunningTime="2026-03-12 00:21:22.949874047 +0000 UTC m=+773.553290367" watchObservedRunningTime="2026-03-12 00:21:22.954209716 +0000 UTC m=+773.557626026" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.792720 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rw8vl"] Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.793955 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.796939 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.798880 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.805348 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rw8vl"] Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.805446 4870 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-lpffv" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.895568 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95c7f5b8-1aad-4f77-8503-20afcfa932f0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rw8vl\" (UID: \"95c7f5b8-1aad-4f77-8503-20afcfa932f0\") " pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.895631 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlr9n\" (UniqueName: \"kubernetes.io/projected/95c7f5b8-1aad-4f77-8503-20afcfa932f0-kube-api-access-wlr9n\") pod \"cert-manager-webhook-6888856db4-rw8vl\" (UID: \"95c7f5b8-1aad-4f77-8503-20afcfa932f0\") " pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.996827 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95c7f5b8-1aad-4f77-8503-20afcfa932f0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rw8vl\" (UID: 
\"95c7f5b8-1aad-4f77-8503-20afcfa932f0\") " pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:26 crc kubenswrapper[4870]: I0312 00:21:26.996876 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlr9n\" (UniqueName: \"kubernetes.io/projected/95c7f5b8-1aad-4f77-8503-20afcfa932f0-kube-api-access-wlr9n\") pod \"cert-manager-webhook-6888856db4-rw8vl\" (UID: \"95c7f5b8-1aad-4f77-8503-20afcfa932f0\") " pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:27 crc kubenswrapper[4870]: I0312 00:21:27.025330 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlr9n\" (UniqueName: \"kubernetes.io/projected/95c7f5b8-1aad-4f77-8503-20afcfa932f0-kube-api-access-wlr9n\") pod \"cert-manager-webhook-6888856db4-rw8vl\" (UID: \"95c7f5b8-1aad-4f77-8503-20afcfa932f0\") " pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:27 crc kubenswrapper[4870]: I0312 00:21:27.028556 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95c7f5b8-1aad-4f77-8503-20afcfa932f0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rw8vl\" (UID: \"95c7f5b8-1aad-4f77-8503-20afcfa932f0\") " pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:27 crc kubenswrapper[4870]: I0312 00:21:27.114916 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.664827 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-h8d8r"] Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.666357 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.669294 4870 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2d8cl" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.678464 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-h8d8r"] Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.831239 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e3bc318-2b39-431b-8de9-7479fe510c00-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-h8d8r\" (UID: \"7e3bc318-2b39-431b-8de9-7479fe510c00\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.831591 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2qgg\" (UniqueName: \"kubernetes.io/projected/7e3bc318-2b39-431b-8de9-7479fe510c00-kube-api-access-q2qgg\") pod \"cert-manager-cainjector-5545bd876-h8d8r\" (UID: \"7e3bc318-2b39-431b-8de9-7479fe510c00\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.932335 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e3bc318-2b39-431b-8de9-7479fe510c00-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-h8d8r\" (UID: \"7e3bc318-2b39-431b-8de9-7479fe510c00\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.932396 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2qgg\" (UniqueName: 
\"kubernetes.io/projected/7e3bc318-2b39-431b-8de9-7479fe510c00-kube-api-access-q2qgg\") pod \"cert-manager-cainjector-5545bd876-h8d8r\" (UID: \"7e3bc318-2b39-431b-8de9-7479fe510c00\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.955809 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e3bc318-2b39-431b-8de9-7479fe510c00-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-h8d8r\" (UID: \"7e3bc318-2b39-431b-8de9-7479fe510c00\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:29 crc kubenswrapper[4870]: I0312 00:21:29.958022 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2qgg\" (UniqueName: \"kubernetes.io/projected/7e3bc318-2b39-431b-8de9-7479fe510c00-kube-api-access-q2qgg\") pod \"cert-manager-cainjector-5545bd876-h8d8r\" (UID: \"7e3bc318-2b39-431b-8de9-7479fe510c00\") " pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:30 crc kubenswrapper[4870]: I0312 00:21:30.001666 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" Mar 12 00:21:30 crc kubenswrapper[4870]: I0312 00:21:30.689795 4870 scope.go:117] "RemoveContainer" containerID="f75292cc45e88bcd878dab5d9e8ab5a87bd7913266e53492c95c49e82186f0cd" Mar 12 00:21:32 crc kubenswrapper[4870]: I0312 00:21:32.408979 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-h8d8r"] Mar 12 00:21:32 crc kubenswrapper[4870]: I0312 00:21:32.430830 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rw8vl"] Mar 12 00:21:32 crc kubenswrapper[4870]: W0312 00:21:32.443216 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c7f5b8_1aad_4f77_8503_20afcfa932f0.slice/crio-b312a5cdfd09b38a97db7edbcfca008c634fc00fb27a5e19532cf6f4f2958bee WatchSource:0}: Error finding container b312a5cdfd09b38a97db7edbcfca008c634fc00fb27a5e19532cf6f4f2958bee: Status 404 returned error can't find the container with id b312a5cdfd09b38a97db7edbcfca008c634fc00fb27a5e19532cf6f4f2958bee Mar 12 00:21:33 crc kubenswrapper[4870]: I0312 00:21:33.016133 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"3e620875-83c0-4493-83df-9301659b5148","Type":"ContainerStarted","Data":"e5dc8c4b96737f020bc4affee329ac1eb934e65d233e865e381044a1dea981bd"} Mar 12 00:21:33 crc kubenswrapper[4870]: I0312 00:21:33.018018 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" event={"ID":"95c7f5b8-1aad-4f77-8503-20afcfa932f0","Type":"ContainerStarted","Data":"b312a5cdfd09b38a97db7edbcfca008c634fc00fb27a5e19532cf6f4f2958bee"} Mar 12 00:21:33 crc kubenswrapper[4870]: I0312 00:21:33.018984 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" 
event={"ID":"7e3bc318-2b39-431b-8de9-7479fe510c00","Type":"ContainerStarted","Data":"13bd5e22692eb2e558d19a9b9318a988a34ce5de7bb9a09aebf863e4037f3f84"} Mar 12 00:21:33 crc kubenswrapper[4870]: I0312 00:21:33.228253 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 12 00:21:33 crc kubenswrapper[4870]: I0312 00:21:33.257562 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 12 00:21:34 crc kubenswrapper[4870]: I0312 00:21:34.030206 4870 generic.go:334] "Generic (PLEG): container finished" podID="3e620875-83c0-4493-83df-9301659b5148" containerID="e5dc8c4b96737f020bc4affee329ac1eb934e65d233e865e381044a1dea981bd" exitCode=0 Mar 12 00:21:34 crc kubenswrapper[4870]: I0312 00:21:34.030261 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"3e620875-83c0-4493-83df-9301659b5148","Type":"ContainerDied","Data":"e5dc8c4b96737f020bc4affee329ac1eb934e65d233e865e381044a1dea981bd"} Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.758594 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-g6xml"] Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.760536 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.768494 4870 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-h9gx6" Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.780098 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-g6xml"] Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.853947 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e826b4ba-aa17-464b-ae23-5d9cce47be51-bound-sa-token\") pod \"cert-manager-545d4d4674-g6xml\" (UID: \"e826b4ba-aa17-464b-ae23-5d9cce47be51\") " pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.854013 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brkd\" (UniqueName: \"kubernetes.io/projected/e826b4ba-aa17-464b-ae23-5d9cce47be51-kube-api-access-4brkd\") pod \"cert-manager-545d4d4674-g6xml\" (UID: \"e826b4ba-aa17-464b-ae23-5d9cce47be51\") " pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.954736 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4brkd\" (UniqueName: \"kubernetes.io/projected/e826b4ba-aa17-464b-ae23-5d9cce47be51-kube-api-access-4brkd\") pod \"cert-manager-545d4d4674-g6xml\" (UID: \"e826b4ba-aa17-464b-ae23-5d9cce47be51\") " pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.954875 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e826b4ba-aa17-464b-ae23-5d9cce47be51-bound-sa-token\") pod \"cert-manager-545d4d4674-g6xml\" (UID: 
\"e826b4ba-aa17-464b-ae23-5d9cce47be51\") " pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.974701 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e826b4ba-aa17-464b-ae23-5d9cce47be51-bound-sa-token\") pod \"cert-manager-545d4d4674-g6xml\" (UID: \"e826b4ba-aa17-464b-ae23-5d9cce47be51\") " pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:37 crc kubenswrapper[4870]: I0312 00:21:37.975328 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brkd\" (UniqueName: \"kubernetes.io/projected/e826b4ba-aa17-464b-ae23-5d9cce47be51-kube-api-access-4brkd\") pod \"cert-manager-545d4d4674-g6xml\" (UID: \"e826b4ba-aa17-464b-ae23-5d9cce47be51\") " pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:38 crc kubenswrapper[4870]: I0312 00:21:38.081063 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-g6xml" Mar 12 00:21:38 crc kubenswrapper[4870]: I0312 00:21:38.488439 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-g6xml"] Mar 12 00:21:38 crc kubenswrapper[4870]: W0312 00:21:38.495467 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode826b4ba_aa17_464b_ae23_5d9cce47be51.slice/crio-97cb1933fe2780e79ae9c3f4e0397bc35a120dc962f4ae772c8cc1f3c907ae3b WatchSource:0}: Error finding container 97cb1933fe2780e79ae9c3f4e0397bc35a120dc962f4ae772c8cc1f3c907ae3b: Status 404 returned error can't find the container with id 97cb1933fe2780e79ae9c3f4e0397bc35a120dc962f4ae772c8cc1f3c907ae3b Mar 12 00:21:39 crc kubenswrapper[4870]: I0312 00:21:39.055920 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-g6xml" 
event={"ID":"e826b4ba-aa17-464b-ae23-5d9cce47be51","Type":"ContainerStarted","Data":"97cb1933fe2780e79ae9c3f4e0397bc35a120dc962f4ae772c8cc1f3c907ae3b"} Mar 12 00:21:40 crc kubenswrapper[4870]: I0312 00:21:40.065960 4870 generic.go:334] "Generic (PLEG): container finished" podID="3e620875-83c0-4493-83df-9301659b5148" containerID="0c35918a1b3699c05ec96aa013abe2b9c0970ba789716c9112d765f9a5f09922" exitCode=0 Mar 12 00:21:40 crc kubenswrapper[4870]: I0312 00:21:40.066064 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"3e620875-83c0-4493-83df-9301659b5148","Type":"ContainerDied","Data":"0c35918a1b3699c05ec96aa013abe2b9c0970ba789716c9112d765f9a5f09922"} Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.078957 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" event={"ID":"7e3bc318-2b39-431b-8de9-7479fe510c00","Type":"ContainerStarted","Data":"f1e99ae7914a3494d254d12664bc578b073aa89385900160f3d38875cb184907"} Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.082273 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"3e620875-83c0-4493-83df-9301659b5148","Type":"ContainerStarted","Data":"632ddaa4c4fbf933d467ccd36f2462ac8fe6f8d1cdfa4548531ec715f8657d88"} Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.082425 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.083715 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-g6xml" event={"ID":"e826b4ba-aa17-464b-ae23-5d9cce47be51","Type":"ContainerStarted","Data":"ab143a4bbf6fbcd3370039a83efa0b2e941080caf969bab92cfc2db2e4e2994a"} Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.084925 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" event={"ID":"95c7f5b8-1aad-4f77-8503-20afcfa932f0","Type":"ContainerStarted","Data":"8472ab5e58c0accfe22cfc5caa08c1c1e9bc47ba9888bfd20261c90565340972"} Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.085048 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.095308 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-h8d8r" podStartSLOduration=4.215599488 podStartE2EDuration="13.095282219s" podCreationTimestamp="2026-03-12 00:21:29 +0000 UTC" firstStartedPulling="2026-03-12 00:21:32.415405558 +0000 UTC m=+783.018821868" lastFinishedPulling="2026-03-12 00:21:41.295088289 +0000 UTC m=+791.898504599" observedRunningTime="2026-03-12 00:21:42.094270489 +0000 UTC m=+792.697686789" watchObservedRunningTime="2026-03-12 00:21:42.095282219 +0000 UTC m=+792.698698559" Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.152601 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" podStartSLOduration=7.278289451 podStartE2EDuration="16.152580311s" podCreationTimestamp="2026-03-12 00:21:26 +0000 UTC" firstStartedPulling="2026-03-12 00:21:32.445246155 +0000 UTC m=+783.048662465" lastFinishedPulling="2026-03-12 00:21:41.319537015 +0000 UTC m=+791.922953325" observedRunningTime="2026-03-12 00:21:42.118640653 +0000 UTC m=+792.722056963" watchObservedRunningTime="2026-03-12 00:21:42.152580311 +0000 UTC m=+792.755996631" Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.156615 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=10.936002544 podStartE2EDuration="27.156604871s" podCreationTimestamp="2026-03-12 00:21:15 +0000 UTC" 
firstStartedPulling="2026-03-12 00:21:16.011630868 +0000 UTC m=+766.615047178" lastFinishedPulling="2026-03-12 00:21:32.232233195 +0000 UTC m=+782.835649505" observedRunningTime="2026-03-12 00:21:42.148000305 +0000 UTC m=+792.751416625" watchObservedRunningTime="2026-03-12 00:21:42.156604871 +0000 UTC m=+792.760021191" Mar 12 00:21:42 crc kubenswrapper[4870]: I0312 00:21:42.172359 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-g6xml" podStartSLOduration=2.374860126 podStartE2EDuration="5.172338189s" podCreationTimestamp="2026-03-12 00:21:37 +0000 UTC" firstStartedPulling="2026-03-12 00:21:38.499622695 +0000 UTC m=+789.103039045" lastFinishedPulling="2026-03-12 00:21:41.297100798 +0000 UTC m=+791.900517108" observedRunningTime="2026-03-12 00:21:42.171076831 +0000 UTC m=+792.774493141" watchObservedRunningTime="2026-03-12 00:21:42.172338189 +0000 UTC m=+792.775754499" Mar 12 00:21:47 crc kubenswrapper[4870]: I0312 00:21:47.118660 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-rw8vl" Mar 12 00:21:47 crc kubenswrapper[4870]: I0312 00:21:47.595038 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:21:47 crc kubenswrapper[4870]: I0312 00:21:47.595125 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:21:56 crc kubenswrapper[4870]: I0312 00:21:56.097715 4870 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.137396 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554582-8dlv7"] Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.138611 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554582-8dlv7" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.141745 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.142246 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.142324 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.148741 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554582-8dlv7"] Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.253964 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4cn7\" (UniqueName: \"kubernetes.io/projected/51df2778-c451-4b96-9c6f-e21248f0945f-kube-api-access-d4cn7\") pod \"auto-csr-approver-29554582-8dlv7\" (UID: \"51df2778-c451-4b96-9c6f-e21248f0945f\") " pod="openshift-infra/auto-csr-approver-29554582-8dlv7" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.356241 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4cn7\" (UniqueName: \"kubernetes.io/projected/51df2778-c451-4b96-9c6f-e21248f0945f-kube-api-access-d4cn7\") pod \"auto-csr-approver-29554582-8dlv7\" (UID: \"51df2778-c451-4b96-9c6f-e21248f0945f\") " 
pod="openshift-infra/auto-csr-approver-29554582-8dlv7" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.380690 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4cn7\" (UniqueName: \"kubernetes.io/projected/51df2778-c451-4b96-9c6f-e21248f0945f-kube-api-access-d4cn7\") pod \"auto-csr-approver-29554582-8dlv7\" (UID: \"51df2778-c451-4b96-9c6f-e21248f0945f\") " pod="openshift-infra/auto-csr-approver-29554582-8dlv7" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.454782 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554582-8dlv7" Mar 12 00:22:00 crc kubenswrapper[4870]: I0312 00:22:00.921274 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554582-8dlv7"] Mar 12 00:22:00 crc kubenswrapper[4870]: W0312 00:22:00.929980 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51df2778_c451_4b96_9c6f_e21248f0945f.slice/crio-217a8b206ce9925903e0e8fae06292d4d2792e9b4099fd107b2d37800bf5b1ad WatchSource:0}: Error finding container 217a8b206ce9925903e0e8fae06292d4d2792e9b4099fd107b2d37800bf5b1ad: Status 404 returned error can't find the container with id 217a8b206ce9925903e0e8fae06292d4d2792e9b4099fd107b2d37800bf5b1ad Mar 12 00:22:01 crc kubenswrapper[4870]: I0312 00:22:01.234054 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554582-8dlv7" event={"ID":"51df2778-c451-4b96-9c6f-e21248f0945f","Type":"ContainerStarted","Data":"217a8b206ce9925903e0e8fae06292d4d2792e9b4099fd107b2d37800bf5b1ad"} Mar 12 00:22:01 crc kubenswrapper[4870]: I0312 00:22:01.313063 4870 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 00:22:03 crc kubenswrapper[4870]: I0312 00:22:03.252690 4870 generic.go:334] "Generic (PLEG): 
container finished" podID="51df2778-c451-4b96-9c6f-e21248f0945f" containerID="bc02f3933e8b2081a12f300268c2f15143928208bc939b386caae7fe9c056ce8" exitCode=0 Mar 12 00:22:03 crc kubenswrapper[4870]: I0312 00:22:03.252797 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554582-8dlv7" event={"ID":"51df2778-c451-4b96-9c6f-e21248f0945f","Type":"ContainerDied","Data":"bc02f3933e8b2081a12f300268c2f15143928208bc939b386caae7fe9c056ce8"} Mar 12 00:22:04 crc kubenswrapper[4870]: I0312 00:22:04.536751 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554582-8dlv7" Mar 12 00:22:04 crc kubenswrapper[4870]: I0312 00:22:04.718461 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4cn7\" (UniqueName: \"kubernetes.io/projected/51df2778-c451-4b96-9c6f-e21248f0945f-kube-api-access-d4cn7\") pod \"51df2778-c451-4b96-9c6f-e21248f0945f\" (UID: \"51df2778-c451-4b96-9c6f-e21248f0945f\") " Mar 12 00:22:04 crc kubenswrapper[4870]: I0312 00:22:04.724343 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51df2778-c451-4b96-9c6f-e21248f0945f-kube-api-access-d4cn7" (OuterVolumeSpecName: "kube-api-access-d4cn7") pod "51df2778-c451-4b96-9c6f-e21248f0945f" (UID: "51df2778-c451-4b96-9c6f-e21248f0945f"). InnerVolumeSpecName "kube-api-access-d4cn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:22:04 crc kubenswrapper[4870]: I0312 00:22:04.819914 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4cn7\" (UniqueName: \"kubernetes.io/projected/51df2778-c451-4b96-9c6f-e21248f0945f-kube-api-access-d4cn7\") on node \"crc\" DevicePath \"\"" Mar 12 00:22:05 crc kubenswrapper[4870]: I0312 00:22:05.270548 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554582-8dlv7" event={"ID":"51df2778-c451-4b96-9c6f-e21248f0945f","Type":"ContainerDied","Data":"217a8b206ce9925903e0e8fae06292d4d2792e9b4099fd107b2d37800bf5b1ad"} Mar 12 00:22:05 crc kubenswrapper[4870]: I0312 00:22:05.270621 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217a8b206ce9925903e0e8fae06292d4d2792e9b4099fd107b2d37800bf5b1ad" Mar 12 00:22:05 crc kubenswrapper[4870]: I0312 00:22:05.270570 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554582-8dlv7" Mar 12 00:22:05 crc kubenswrapper[4870]: I0312 00:22:05.601216 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554576-mrf2j"] Mar 12 00:22:05 crc kubenswrapper[4870]: I0312 00:22:05.609231 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554576-mrf2j"] Mar 12 00:22:06 crc kubenswrapper[4870]: I0312 00:22:06.114390 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0258ba-0167-4413-8de7-5b01a8faec96" path="/var/lib/kubelet/pods/aa0258ba-0167-4413-8de7-5b01a8faec96/volumes" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.240384 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7g8cc/must-gather-wx7gh"] Mar 12 00:22:12 crc kubenswrapper[4870]: E0312 00:22:12.241220 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="51df2778-c451-4b96-9c6f-e21248f0945f" containerName="oc" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.241235 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="51df2778-c451-4b96-9c6f-e21248f0945f" containerName="oc" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.241403 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="51df2778-c451-4b96-9c6f-e21248f0945f" containerName="oc" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.242088 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.257225 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7g8cc"/"default-dockercfg-zc4sz" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.260045 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7g8cc"/"kube-root-ca.crt" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.262711 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7g8cc"/"openshift-service-ca.crt" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.284662 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7g8cc/must-gather-wx7gh"] Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.425242 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77867a54-1bc3-485c-a7b3-0975e8cdfd46-must-gather-output\") pod \"must-gather-wx7gh\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") " pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.425361 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8f5k\" (UniqueName: 
\"kubernetes.io/projected/77867a54-1bc3-485c-a7b3-0975e8cdfd46-kube-api-access-q8f5k\") pod \"must-gather-wx7gh\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") " pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.526785 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8f5k\" (UniqueName: \"kubernetes.io/projected/77867a54-1bc3-485c-a7b3-0975e8cdfd46-kube-api-access-q8f5k\") pod \"must-gather-wx7gh\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") " pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.526853 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77867a54-1bc3-485c-a7b3-0975e8cdfd46-must-gather-output\") pod \"must-gather-wx7gh\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") " pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.527350 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77867a54-1bc3-485c-a7b3-0975e8cdfd46-must-gather-output\") pod \"must-gather-wx7gh\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") " pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.548332 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8f5k\" (UniqueName: \"kubernetes.io/projected/77867a54-1bc3-485c-a7b3-0975e8cdfd46-kube-api-access-q8f5k\") pod \"must-gather-wx7gh\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") " pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.558929 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" Mar 12 00:22:12 crc kubenswrapper[4870]: I0312 00:22:12.756112 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7g8cc/must-gather-wx7gh"] Mar 12 00:22:13 crc kubenswrapper[4870]: I0312 00:22:13.342357 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" event={"ID":"77867a54-1bc3-485c-a7b3-0975e8cdfd46","Type":"ContainerStarted","Data":"357d1518d7af3d3de7d3ae795ef918f3243411244444037a274de63b370133e4"} Mar 12 00:22:17 crc kubenswrapper[4870]: I0312 00:22:17.595048 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:22:17 crc kubenswrapper[4870]: I0312 00:22:17.595484 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:22:17 crc kubenswrapper[4870]: I0312 00:22:17.595553 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:22:17 crc kubenswrapper[4870]: I0312 00:22:17.596346 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d34e3dbb71186ce8356c02e5bee2ab1ff708583b71cba126470e3c14ba16321"} pod="openshift-machine-config-operator/machine-config-daemon-84dfr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 00:22:17 crc kubenswrapper[4870]: 
I0312 00:22:17.596421 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" containerID="cri-o://9d34e3dbb71186ce8356c02e5bee2ab1ff708583b71cba126470e3c14ba16321" gracePeriod=600 Mar 12 00:22:18 crc kubenswrapper[4870]: I0312 00:22:18.376043 4870 generic.go:334] "Generic (PLEG): container finished" podID="988c0290-1e98-46c8-8253-a4718914b9ef" containerID="9d34e3dbb71186ce8356c02e5bee2ab1ff708583b71cba126470e3c14ba16321" exitCode=0 Mar 12 00:22:18 crc kubenswrapper[4870]: I0312 00:22:18.376086 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerDied","Data":"9d34e3dbb71186ce8356c02e5bee2ab1ff708583b71cba126470e3c14ba16321"} Mar 12 00:22:18 crc kubenswrapper[4870]: I0312 00:22:18.376155 4870 scope.go:117] "RemoveContainer" containerID="1741f7c30d6275bdbc591187e9d7f1701084fc4106be15d405493726cd83c068" Mar 12 00:22:20 crc kubenswrapper[4870]: I0312 00:22:20.390967 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"31940fcb2bdca3b9e93d8f4a5594da9981665262ae31be473e93140ad64f407d"} Mar 12 00:22:21 crc kubenswrapper[4870]: I0312 00:22:21.400209 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" event={"ID":"77867a54-1bc3-485c-a7b3-0975e8cdfd46","Type":"ContainerStarted","Data":"9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d"} Mar 12 00:22:21 crc kubenswrapper[4870]: I0312 00:22:21.400617 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" 
event={"ID":"77867a54-1bc3-485c-a7b3-0975e8cdfd46","Type":"ContainerStarted","Data":"fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100"} Mar 12 00:22:21 crc kubenswrapper[4870]: I0312 00:22:21.430826 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" podStartSLOduration=1.8140306480000001 podStartE2EDuration="9.430788859s" podCreationTimestamp="2026-03-12 00:22:12 +0000 UTC" firstStartedPulling="2026-03-12 00:22:12.767328844 +0000 UTC m=+823.370745154" lastFinishedPulling="2026-03-12 00:22:20.384087055 +0000 UTC m=+830.987503365" observedRunningTime="2026-03-12 00:22:21.42441354 +0000 UTC m=+832.027829850" watchObservedRunningTime="2026-03-12 00:22:21.430788859 +0000 UTC m=+832.034205209" Mar 12 00:22:32 crc kubenswrapper[4870]: I0312 00:22:32.055345 4870 scope.go:117] "RemoveContainer" containerID="527647a7d31cbda72fa8e802ea76c84fdf0213032e30a9a5f1c782c1800b1c27" Mar 12 00:23:03 crc kubenswrapper[4870]: I0312 00:23:03.356391 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2bvpw_aded5c32-6731-43cc-8701-4d847d663dd2/control-plane-machine-set-operator/0.log" Mar 12 00:23:03 crc kubenswrapper[4870]: I0312 00:23:03.483831 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kvh84_3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a/kube-rbac-proxy/0.log" Mar 12 00:23:03 crc kubenswrapper[4870]: I0312 00:23:03.545792 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kvh84_3ff417d8-c2c5-40bf-bc0a-2718a9f88e2a/machine-api-operator/0.log" Mar 12 00:23:15 crc kubenswrapper[4870]: I0312 00:23:15.955310 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-g6xml_e826b4ba-aa17-464b-ae23-5d9cce47be51/cert-manager-controller/0.log" Mar 12 00:23:16 
crc kubenswrapper[4870]: I0312 00:23:16.101952 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-h8d8r_7e3bc318-2b39-431b-8de9-7479fe510c00/cert-manager-cainjector/0.log" Mar 12 00:23:16 crc kubenswrapper[4870]: I0312 00:23:16.105070 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-rw8vl_95c7f5b8-1aad-4f77-8503-20afcfa932f0/cert-manager-webhook/0.log" Mar 12 00:23:31 crc kubenswrapper[4870]: I0312 00:23:31.541756 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-zttt2_8ac96e15-a84c-4d36-a106-a5a786ecc075/prometheus-operator/0.log" Mar 12 00:23:31 crc kubenswrapper[4870]: I0312 00:23:31.656414 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j_f44c8e76-1dd9-45f3-a81b-187476812817/prometheus-operator-admission-webhook/0.log" Mar 12 00:23:31 crc kubenswrapper[4870]: I0312 00:23:31.693830 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z_b9b62e7c-69f6-4997-bdb4-f342b7ac04eb/prometheus-operator-admission-webhook/0.log" Mar 12 00:23:31 crc kubenswrapper[4870]: I0312 00:23:31.805400 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-vnh6n_a807a2a6-682f-4d87-816a-fe0a8ca96410/operator/0.log" Mar 12 00:23:31 crc kubenswrapper[4870]: I0312 00:23:31.853966 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tdzmm_8a679de6-3b5b-4ac3-a070-201c270ec629/perses-operator/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.249479 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp_812a958c-4d49-4c60-b2c8-34702f4ec92f/util/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.384856 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp_812a958c-4d49-4c60-b2c8-34702f4ec92f/util/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.457023 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp_812a958c-4d49-4c60-b2c8-34702f4ec92f/pull/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.465708 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp_812a958c-4d49-4c60-b2c8-34702f4ec92f/pull/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.589640 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp_812a958c-4d49-4c60-b2c8-34702f4ec92f/util/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.594964 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp_812a958c-4d49-4c60-b2c8-34702f4ec92f/pull/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.621951 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exgsgp_812a958c-4d49-4c60-b2c8-34702f4ec92f/extract/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.747923 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l_6e7a1769-d2e9-4230-9964-8c2b0162a247/util/0.log" Mar 12 
00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.932725 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l_6e7a1769-d2e9-4230-9964-8c2b0162a247/util/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.980271 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l_6e7a1769-d2e9-4230-9964-8c2b0162a247/pull/0.log" Mar 12 00:23:46 crc kubenswrapper[4870]: I0312 00:23:46.984987 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l_6e7a1769-d2e9-4230-9964-8c2b0162a247/pull/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.100031 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l_6e7a1769-d2e9-4230-9964-8c2b0162a247/util/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.112277 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l_6e7a1769-d2e9-4230-9964-8c2b0162a247/pull/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.154512 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55t94l_6e7a1769-d2e9-4230-9964-8c2b0162a247/extract/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.256712 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d_c8c66009-c5f6-417d-8250-90e3b514c5ab/util/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.393439 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d_c8c66009-c5f6-417d-8250-90e3b514c5ab/pull/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.406948 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d_c8c66009-c5f6-417d-8250-90e3b514c5ab/pull/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.455177 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d_c8c66009-c5f6-417d-8250-90e3b514c5ab/util/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.584613 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d_c8c66009-c5f6-417d-8250-90e3b514c5ab/pull/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.620696 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d_c8c66009-c5f6-417d-8250-90e3b514c5ab/util/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.634749 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t9h2d_c8c66009-c5f6-417d-8250-90e3b514c5ab/extract/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.773006 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s5gpv_ca7a7ade-abe4-463f-937d-e6c399cdf72c/extract-utilities/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 00:23:47.973632 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s5gpv_ca7a7ade-abe4-463f-937d-e6c399cdf72c/extract-utilities/0.log" Mar 12 00:23:47 crc kubenswrapper[4870]: I0312 
00:23:47.993603 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s5gpv_ca7a7ade-abe4-463f-937d-e6c399cdf72c/extract-content/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.027571 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s5gpv_ca7a7ade-abe4-463f-937d-e6c399cdf72c/extract-content/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.195937 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s5gpv_ca7a7ade-abe4-463f-937d-e6c399cdf72c/extract-utilities/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.236725 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s5gpv_ca7a7ade-abe4-463f-937d-e6c399cdf72c/extract-content/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.532192 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k6pq4_4f1fb5d4-41a1-4908-a685-974af39fbbc5/extract-utilities/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.604683 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s5gpv_ca7a7ade-abe4-463f-937d-e6c399cdf72c/registry-server/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.698722 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k6pq4_4f1fb5d4-41a1-4908-a685-974af39fbbc5/extract-utilities/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.724380 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k6pq4_4f1fb5d4-41a1-4908-a685-974af39fbbc5/extract-content/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.744533 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-k6pq4_4f1fb5d4-41a1-4908-a685-974af39fbbc5/extract-content/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.869589 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k6pq4_4f1fb5d4-41a1-4908-a685-974af39fbbc5/extract-utilities/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.951057 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k6pq4_4f1fb5d4-41a1-4908-a685-974af39fbbc5/registry-server/0.log" Mar 12 00:23:48 crc kubenswrapper[4870]: I0312 00:23:48.970317 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k6pq4_4f1fb5d4-41a1-4908-a685-974af39fbbc5/extract-content/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.120765 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-72wh5_a0f4d065-12c4-4d6c-aa9e-56560911ed54/marketplace-operator/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.139237 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2sbw_6493de17-4588-4ee6-8d01-ad464fbc01a4/extract-utilities/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.329161 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2sbw_6493de17-4588-4ee6-8d01-ad464fbc01a4/extract-utilities/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.354916 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2sbw_6493de17-4588-4ee6-8d01-ad464fbc01a4/extract-content/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.367118 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-d2sbw_6493de17-4588-4ee6-8d01-ad464fbc01a4/extract-content/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.532502 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2sbw_6493de17-4588-4ee6-8d01-ad464fbc01a4/extract-utilities/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.580472 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2sbw_6493de17-4588-4ee6-8d01-ad464fbc01a4/extract-content/0.log" Mar 12 00:23:49 crc kubenswrapper[4870]: I0312 00:23:49.681212 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d2sbw_6493de17-4588-4ee6-8d01-ad464fbc01a4/registry-server/0.log" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.140443 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554584-xfc8p"] Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.141872 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554584-xfc8p" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.144188 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.144481 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.145856 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.147985 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554584-xfc8p"] Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.307188 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflbj\" (UniqueName: \"kubernetes.io/projected/c049927c-1539-46da-a585-d6c18ea2fb3a-kube-api-access-rflbj\") pod \"auto-csr-approver-29554584-xfc8p\" (UID: \"c049927c-1539-46da-a585-d6c18ea2fb3a\") " pod="openshift-infra/auto-csr-approver-29554584-xfc8p" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.408525 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rflbj\" (UniqueName: \"kubernetes.io/projected/c049927c-1539-46da-a585-d6c18ea2fb3a-kube-api-access-rflbj\") pod \"auto-csr-approver-29554584-xfc8p\" (UID: \"c049927c-1539-46da-a585-d6c18ea2fb3a\") " pod="openshift-infra/auto-csr-approver-29554584-xfc8p" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.432846 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflbj\" (UniqueName: \"kubernetes.io/projected/c049927c-1539-46da-a585-d6c18ea2fb3a-kube-api-access-rflbj\") pod \"auto-csr-approver-29554584-xfc8p\" (UID: \"c049927c-1539-46da-a585-d6c18ea2fb3a\") " 
pod="openshift-infra/auto-csr-approver-29554584-xfc8p" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.460327 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554584-xfc8p" Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.769261 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554584-xfc8p"] Mar 12 00:24:00 crc kubenswrapper[4870]: I0312 00:24:00.785279 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 00:24:01 crc kubenswrapper[4870]: I0312 00:24:01.074037 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554584-xfc8p" event={"ID":"c049927c-1539-46da-a585-d6c18ea2fb3a","Type":"ContainerStarted","Data":"80566aca5258544e159caad8de08ec97fbad5a039ee9a77c9b4d0d5889d300eb"} Mar 12 00:24:02 crc kubenswrapper[4870]: I0312 00:24:02.967787 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-zttt2_8ac96e15-a84c-4d36-a106-a5a786ecc075/prometheus-operator/0.log" Mar 12 00:24:02 crc kubenswrapper[4870]: I0312 00:24:02.998158 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7fc8b5bf78-7lh8j_f44c8e76-1dd9-45f3-a81b-187476812817/prometheus-operator-admission-webhook/0.log" Mar 12 00:24:03 crc kubenswrapper[4870]: I0312 00:24:03.063940 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7fc8b5bf78-vgd7z_b9b62e7c-69f6-4997-bdb4-f342b7ac04eb/prometheus-operator-admission-webhook/0.log" Mar 12 00:24:03 crc kubenswrapper[4870]: I0312 00:24:03.085533 4870 generic.go:334] "Generic (PLEG): container finished" podID="c049927c-1539-46da-a585-d6c18ea2fb3a" containerID="3a04602e6dfc0d696e752e66046d8ff72f0243f3f226769670514867d501d9c2" exitCode=0 Mar 
12 00:24:03 crc kubenswrapper[4870]: I0312 00:24:03.085580 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554584-xfc8p" event={"ID":"c049927c-1539-46da-a585-d6c18ea2fb3a","Type":"ContainerDied","Data":"3a04602e6dfc0d696e752e66046d8ff72f0243f3f226769670514867d501d9c2"} Mar 12 00:24:03 crc kubenswrapper[4870]: I0312 00:24:03.134677 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tdzmm_8a679de6-3b5b-4ac3-a070-201c270ec629/perses-operator/0.log" Mar 12 00:24:03 crc kubenswrapper[4870]: I0312 00:24:03.190076 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-vnh6n_a807a2a6-682f-4d87-816a-fe0a8ca96410/operator/0.log" Mar 12 00:24:04 crc kubenswrapper[4870]: I0312 00:24:04.429706 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554584-xfc8p" Mar 12 00:24:04 crc kubenswrapper[4870]: I0312 00:24:04.581619 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rflbj\" (UniqueName: \"kubernetes.io/projected/c049927c-1539-46da-a585-d6c18ea2fb3a-kube-api-access-rflbj\") pod \"c049927c-1539-46da-a585-d6c18ea2fb3a\" (UID: \"c049927c-1539-46da-a585-d6c18ea2fb3a\") " Mar 12 00:24:04 crc kubenswrapper[4870]: I0312 00:24:04.594841 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c049927c-1539-46da-a585-d6c18ea2fb3a-kube-api-access-rflbj" (OuterVolumeSpecName: "kube-api-access-rflbj") pod "c049927c-1539-46da-a585-d6c18ea2fb3a" (UID: "c049927c-1539-46da-a585-d6c18ea2fb3a"). InnerVolumeSpecName "kube-api-access-rflbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:24:04 crc kubenswrapper[4870]: I0312 00:24:04.683286 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rflbj\" (UniqueName: \"kubernetes.io/projected/c049927c-1539-46da-a585-d6c18ea2fb3a-kube-api-access-rflbj\") on node \"crc\" DevicePath \"\"" Mar 12 00:24:05 crc kubenswrapper[4870]: I0312 00:24:05.098640 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554584-xfc8p" event={"ID":"c049927c-1539-46da-a585-d6c18ea2fb3a","Type":"ContainerDied","Data":"80566aca5258544e159caad8de08ec97fbad5a039ee9a77c9b4d0d5889d300eb"} Mar 12 00:24:05 crc kubenswrapper[4870]: I0312 00:24:05.098681 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80566aca5258544e159caad8de08ec97fbad5a039ee9a77c9b4d0d5889d300eb" Mar 12 00:24:05 crc kubenswrapper[4870]: I0312 00:24:05.098686 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554584-xfc8p" Mar 12 00:24:05 crc kubenswrapper[4870]: I0312 00:24:05.490295 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554578-zb5pv"] Mar 12 00:24:05 crc kubenswrapper[4870]: I0312 00:24:05.493908 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554578-zb5pv"] Mar 12 00:24:06 crc kubenswrapper[4870]: I0312 00:24:06.111073 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00368cf7-b70c-425e-843a-f57d1ed13c51" path="/var/lib/kubelet/pods/00368cf7-b70c-425e-843a-f57d1ed13c51/volumes" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.612037 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qx5zc"] Mar 12 00:24:16 crc kubenswrapper[4870]: E0312 00:24:16.612897 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c049927c-1539-46da-a585-d6c18ea2fb3a" containerName="oc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.612913 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c049927c-1539-46da-a585-d6c18ea2fb3a" containerName="oc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.613036 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c049927c-1539-46da-a585-d6c18ea2fb3a" containerName="oc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.613961 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.628682 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx5zc"] Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.643439 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-catalog-content\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.643612 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbhbs\" (UniqueName: \"kubernetes.io/projected/27c73847-da80-4ef3-921f-ce7b3230533c-kube-api-access-mbhbs\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.643726 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-utilities\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " 
pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.744762 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-catalog-content\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.745031 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbhbs\" (UniqueName: \"kubernetes.io/projected/27c73847-da80-4ef3-921f-ce7b3230533c-kube-api-access-mbhbs\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.745232 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-utilities\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.745574 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-catalog-content\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.745698 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-utilities\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc 
kubenswrapper[4870]: I0312 00:24:16.763363 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbhbs\" (UniqueName: \"kubernetes.io/projected/27c73847-da80-4ef3-921f-ce7b3230533c-kube-api-access-mbhbs\") pod \"redhat-operators-qx5zc\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") " pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:16 crc kubenswrapper[4870]: I0312 00:24:16.937464 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:17 crc kubenswrapper[4870]: I0312 00:24:17.134970 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx5zc"] Mar 12 00:24:17 crc kubenswrapper[4870]: I0312 00:24:17.195288 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx5zc" event={"ID":"27c73847-da80-4ef3-921f-ce7b3230533c","Type":"ContainerStarted","Data":"3e9138bd8a685b149eeb49f709a37a8588188447385e84e145866d391cf19761"} Mar 12 00:24:18 crc kubenswrapper[4870]: I0312 00:24:18.204520 4870 generic.go:334] "Generic (PLEG): container finished" podID="27c73847-da80-4ef3-921f-ce7b3230533c" containerID="854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7" exitCode=0 Mar 12 00:24:18 crc kubenswrapper[4870]: I0312 00:24:18.204574 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx5zc" event={"ID":"27c73847-da80-4ef3-921f-ce7b3230533c","Type":"ContainerDied","Data":"854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7"} Mar 12 00:24:19 crc kubenswrapper[4870]: I0312 00:24:19.215958 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx5zc" event={"ID":"27c73847-da80-4ef3-921f-ce7b3230533c","Type":"ContainerStarted","Data":"d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6"} Mar 12 00:24:20 crc kubenswrapper[4870]: I0312 
00:24:20.227362 4870 generic.go:334] "Generic (PLEG): container finished" podID="27c73847-da80-4ef3-921f-ce7b3230533c" containerID="d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6" exitCode=0 Mar 12 00:24:20 crc kubenswrapper[4870]: I0312 00:24:20.227439 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx5zc" event={"ID":"27c73847-da80-4ef3-921f-ce7b3230533c","Type":"ContainerDied","Data":"d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6"} Mar 12 00:24:21 crc kubenswrapper[4870]: I0312 00:24:21.239011 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx5zc" event={"ID":"27c73847-da80-4ef3-921f-ce7b3230533c","Type":"ContainerStarted","Data":"868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8"} Mar 12 00:24:21 crc kubenswrapper[4870]: I0312 00:24:21.266608 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qx5zc" podStartSLOduration=2.805133118 podStartE2EDuration="5.26658402s" podCreationTimestamp="2026-03-12 00:24:16 +0000 UTC" firstStartedPulling="2026-03-12 00:24:18.206322717 +0000 UTC m=+948.809739037" lastFinishedPulling="2026-03-12 00:24:20.667773619 +0000 UTC m=+951.271189939" observedRunningTime="2026-03-12 00:24:21.26068117 +0000 UTC m=+951.864097510" watchObservedRunningTime="2026-03-12 00:24:21.26658402 +0000 UTC m=+951.870000370" Mar 12 00:24:26 crc kubenswrapper[4870]: I0312 00:24:26.937903 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:26 crc kubenswrapper[4870]: I0312 00:24:26.938333 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qx5zc" Mar 12 00:24:27 crc kubenswrapper[4870]: I0312 00:24:27.985376 4870 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-qx5zc" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="registry-server" probeResult="failure" output=<
Mar 12 00:24:27 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s
Mar 12 00:24:27 crc kubenswrapper[4870]: >
Mar 12 00:24:32 crc kubenswrapper[4870]: I0312 00:24:32.131588 4870 scope.go:117] "RemoveContainer" containerID="d89b172eb80071e840719224a556eb15433e50234f22f512951e9715781646ca"
Mar 12 00:24:36 crc kubenswrapper[4870]: I0312 00:24:36.990122 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qx5zc"
Mar 12 00:24:37 crc kubenswrapper[4870]: I0312 00:24:37.045964 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qx5zc"
Mar 12 00:24:37 crc kubenswrapper[4870]: I0312 00:24:37.246266 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qx5zc"]
Mar 12 00:24:38 crc kubenswrapper[4870]: I0312 00:24:38.368762 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qx5zc" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="registry-server" containerID="cri-o://868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8" gracePeriod=2
Mar 12 00:24:38 crc kubenswrapper[4870]: I0312 00:24:38.783665 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx5zc"
Mar 12 00:24:38 crc kubenswrapper[4870]: I0312 00:24:38.956593 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbhbs\" (UniqueName: \"kubernetes.io/projected/27c73847-da80-4ef3-921f-ce7b3230533c-kube-api-access-mbhbs\") pod \"27c73847-da80-4ef3-921f-ce7b3230533c\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") "
Mar 12 00:24:38 crc kubenswrapper[4870]: I0312 00:24:38.956691 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-catalog-content\") pod \"27c73847-da80-4ef3-921f-ce7b3230533c\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") "
Mar 12 00:24:38 crc kubenswrapper[4870]: I0312 00:24:38.956828 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-utilities\") pod \"27c73847-da80-4ef3-921f-ce7b3230533c\" (UID: \"27c73847-da80-4ef3-921f-ce7b3230533c\") "
Mar 12 00:24:38 crc kubenswrapper[4870]: I0312 00:24:38.957811 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-utilities" (OuterVolumeSpecName: "utilities") pod "27c73847-da80-4ef3-921f-ce7b3230533c" (UID: "27c73847-da80-4ef3-921f-ce7b3230533c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:24:38 crc kubenswrapper[4870]: I0312 00:24:38.980913 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c73847-da80-4ef3-921f-ce7b3230533c-kube-api-access-mbhbs" (OuterVolumeSpecName: "kube-api-access-mbhbs") pod "27c73847-da80-4ef3-921f-ce7b3230533c" (UID: "27c73847-da80-4ef3-921f-ce7b3230533c"). InnerVolumeSpecName "kube-api-access-mbhbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.058942 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbhbs\" (UniqueName: \"kubernetes.io/projected/27c73847-da80-4ef3-921f-ce7b3230533c-kube-api-access-mbhbs\") on node \"crc\" DevicePath \"\""
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.059354 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.125944 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27c73847-da80-4ef3-921f-ce7b3230533c" (UID: "27c73847-da80-4ef3-921f-ce7b3230533c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.161167 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c73847-da80-4ef3-921f-ce7b3230533c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.385813 4870 generic.go:334] "Generic (PLEG): container finished" podID="27c73847-da80-4ef3-921f-ce7b3230533c" containerID="868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8" exitCode=0
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.385875 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx5zc" event={"ID":"27c73847-da80-4ef3-921f-ce7b3230533c","Type":"ContainerDied","Data":"868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8"}
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.385915 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx5zc" event={"ID":"27c73847-da80-4ef3-921f-ce7b3230533c","Type":"ContainerDied","Data":"3e9138bd8a685b149eeb49f709a37a8588188447385e84e145866d391cf19761"}
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.385947 4870 scope.go:117] "RemoveContainer" containerID="868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.385978 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx5zc"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.407834 4870 scope.go:117] "RemoveContainer" containerID="d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.433314 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qx5zc"]
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.437706 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qx5zc"]
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.462954 4870 scope.go:117] "RemoveContainer" containerID="854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.489859 4870 scope.go:117] "RemoveContainer" containerID="868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8"
Mar 12 00:24:39 crc kubenswrapper[4870]: E0312 00:24:39.490324 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8\": container with ID starting with 868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8 not found: ID does not exist" containerID="868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.490364 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8"} err="failed to get container status \"868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8\": rpc error: code = NotFound desc = could not find container \"868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8\": container with ID starting with 868668176ecd567fd434f25eeff9c775817a6eb27a409d8dda73c054e61e6db8 not found: ID does not exist"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.490390 4870 scope.go:117] "RemoveContainer" containerID="d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6"
Mar 12 00:24:39 crc kubenswrapper[4870]: E0312 00:24:39.490768 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6\": container with ID starting with d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6 not found: ID does not exist" containerID="d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.490828 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6"} err="failed to get container status \"d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6\": rpc error: code = NotFound desc = could not find container \"d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6\": container with ID starting with d9daf3ee41b8cf788e24e7fafad7f578f5ffc77989ce18e11db49413549a29f6 not found: ID does not exist"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.490868 4870 scope.go:117] "RemoveContainer" containerID="854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7"
Mar 12 00:24:39 crc kubenswrapper[4870]: E0312 00:24:39.491223 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7\": container with ID starting with 854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7 not found: ID does not exist" containerID="854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7"
Mar 12 00:24:39 crc kubenswrapper[4870]: I0312 00:24:39.491243 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7"} err="failed to get container status \"854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7\": rpc error: code = NotFound desc = could not find container \"854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7\": container with ID starting with 854952cce6cc220122ad5fd6bda9cbcbb00e547657d40e5c2d9d6289c2f79fc7 not found: ID does not exist"
Mar 12 00:24:40 crc kubenswrapper[4870]: I0312 00:24:40.114254 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" path="/var/lib/kubelet/pods/27c73847-da80-4ef3-921f-ce7b3230533c/volumes"
Mar 12 00:24:47 crc kubenswrapper[4870]: I0312 00:24:47.594754 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 00:24:47 crc kubenswrapper[4870]: I0312 00:24:47.595202 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.805497 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gznsz"]
Mar 12 00:24:48 crc kubenswrapper[4870]: E0312 00:24:48.806124 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="extract-content"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.806159 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="extract-content"
Mar 12 00:24:48 crc kubenswrapper[4870]: E0312 00:24:48.806183 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="extract-utilities"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.806191 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="extract-utilities"
Mar 12 00:24:48 crc kubenswrapper[4870]: E0312 00:24:48.806219 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="registry-server"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.806227 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="registry-server"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.806373 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c73847-da80-4ef3-921f-ce7b3230533c" containerName="registry-server"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.807855 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.868096 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gznsz"]
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.907250 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-utilities\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.907323 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-catalog-content\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:48 crc kubenswrapper[4870]: I0312 00:24:48.907591 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hfl\" (UniqueName: \"kubernetes.io/projected/bad79eae-c77b-4ab6-ac3f-048e08929cfc-kube-api-access-67hfl\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.008447 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-utilities\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.008548 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-catalog-content\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.008613 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67hfl\" (UniqueName: \"kubernetes.io/projected/bad79eae-c77b-4ab6-ac3f-048e08929cfc-kube-api-access-67hfl\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.009019 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-utilities\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.009049 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-catalog-content\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.036969 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hfl\" (UniqueName: \"kubernetes.io/projected/bad79eae-c77b-4ab6-ac3f-048e08929cfc-kube-api-access-67hfl\") pod \"community-operators-gznsz\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") " pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.135838 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.417178 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gznsz"]
Mar 12 00:24:49 crc kubenswrapper[4870]: W0312 00:24:49.434044 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad79eae_c77b_4ab6_ac3f_048e08929cfc.slice/crio-476e492d10d39fadb297214ea046da46174310964e56d224832150fe087e5090 WatchSource:0}: Error finding container 476e492d10d39fadb297214ea046da46174310964e56d224832150fe087e5090: Status 404 returned error can't find the container with id 476e492d10d39fadb297214ea046da46174310964e56d224832150fe087e5090
Mar 12 00:24:49 crc kubenswrapper[4870]: I0312 00:24:49.466561 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gznsz" event={"ID":"bad79eae-c77b-4ab6-ac3f-048e08929cfc","Type":"ContainerStarted","Data":"476e492d10d39fadb297214ea046da46174310964e56d224832150fe087e5090"}
Mar 12 00:24:50 crc kubenswrapper[4870]: I0312 00:24:50.475276 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gznsz" event={"ID":"bad79eae-c77b-4ab6-ac3f-048e08929cfc","Type":"ContainerDied","Data":"13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea"}
Mar 12 00:24:50 crc kubenswrapper[4870]: I0312 00:24:50.474824 4870 generic.go:334] "Generic (PLEG): container finished" podID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerID="13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea" exitCode=0
Mar 12 00:24:52 crc kubenswrapper[4870]: I0312 00:24:52.491278 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gznsz" event={"ID":"bad79eae-c77b-4ab6-ac3f-048e08929cfc","Type":"ContainerStarted","Data":"4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405"}
Mar 12 00:24:52 crc kubenswrapper[4870]: E0312 00:24:52.657342 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad79eae_c77b_4ab6_ac3f_048e08929cfc.slice/crio-4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405.scope\": RecentStats: unable to find data in memory cache]"
Mar 12 00:24:53 crc kubenswrapper[4870]: I0312 00:24:53.499967 4870 generic.go:334] "Generic (PLEG): container finished" podID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerID="4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405" exitCode=0
Mar 12 00:24:53 crc kubenswrapper[4870]: I0312 00:24:53.500026 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gznsz" event={"ID":"bad79eae-c77b-4ab6-ac3f-048e08929cfc","Type":"ContainerDied","Data":"4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405"}
Mar 12 00:24:55 crc kubenswrapper[4870]: I0312 00:24:55.514939 4870 generic.go:334] "Generic (PLEG): container finished" podID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerID="fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100" exitCode=0
Mar 12 00:24:55 crc kubenswrapper[4870]: I0312 00:24:55.515027 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" event={"ID":"77867a54-1bc3-485c-a7b3-0975e8cdfd46","Type":"ContainerDied","Data":"fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100"}
Mar 12 00:24:55 crc kubenswrapper[4870]: I0312 00:24:55.516095 4870 scope.go:117] "RemoveContainer" containerID="fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100"
Mar 12 00:24:55 crc kubenswrapper[4870]: I0312 00:24:55.517866 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gznsz" event={"ID":"bad79eae-c77b-4ab6-ac3f-048e08929cfc","Type":"ContainerStarted","Data":"fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246"}
Mar 12 00:24:55 crc kubenswrapper[4870]: I0312 00:24:55.562029 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gznsz" podStartSLOduration=3.127227746 podStartE2EDuration="7.562004237s" podCreationTimestamp="2026-03-12 00:24:48 +0000 UTC" firstStartedPulling="2026-03-12 00:24:50.477229924 +0000 UTC m=+981.080646234" lastFinishedPulling="2026-03-12 00:24:54.912006375 +0000 UTC m=+985.515422725" observedRunningTime="2026-03-12 00:24:55.555939692 +0000 UTC m=+986.159356022" watchObservedRunningTime="2026-03-12 00:24:55.562004237 +0000 UTC m=+986.165420587"
Mar 12 00:24:55 crc kubenswrapper[4870]: I0312 00:24:55.942471 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7g8cc_must-gather-wx7gh_77867a54-1bc3-485c-a7b3-0975e8cdfd46/gather/0.log"
Mar 12 00:24:59 crc kubenswrapper[4870]: I0312 00:24:59.136385 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:59 crc kubenswrapper[4870]: I0312 00:24:59.136884 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:24:59 crc kubenswrapper[4870]: I0312 00:24:59.196648 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.003714 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7g8cc/must-gather-wx7gh"]
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.004296 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7g8cc/must-gather-wx7gh" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerName="copy" containerID="cri-o://9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d" gracePeriod=2
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.008012 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7g8cc/must-gather-wx7gh"]
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.380591 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7g8cc_must-gather-wx7gh_77867a54-1bc3-485c-a7b3-0975e8cdfd46/copy/0.log"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.380925 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7g8cc/must-gather-wx7gh"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.504005 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77867a54-1bc3-485c-a7b3-0975e8cdfd46-must-gather-output\") pod \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") "
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.504235 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8f5k\" (UniqueName: \"kubernetes.io/projected/77867a54-1bc3-485c-a7b3-0975e8cdfd46-kube-api-access-q8f5k\") pod \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\" (UID: \"77867a54-1bc3-485c-a7b3-0975e8cdfd46\") "
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.513400 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77867a54-1bc3-485c-a7b3-0975e8cdfd46-kube-api-access-q8f5k" (OuterVolumeSpecName: "kube-api-access-q8f5k") pod "77867a54-1bc3-485c-a7b3-0975e8cdfd46" (UID: "77867a54-1bc3-485c-a7b3-0975e8cdfd46"). InnerVolumeSpecName "kube-api-access-q8f5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.558092 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77867a54-1bc3-485c-a7b3-0975e8cdfd46-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "77867a54-1bc3-485c-a7b3-0975e8cdfd46" (UID: "77867a54-1bc3-485c-a7b3-0975e8cdfd46"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.577837 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7g8cc_must-gather-wx7gh_77867a54-1bc3-485c-a7b3-0975e8cdfd46/copy/0.log"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.578258 4870 generic.go:334] "Generic (PLEG): container finished" podID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerID="9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d" exitCode=143
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.578330 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7g8cc/must-gather-wx7gh"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.578361 4870 scope.go:117] "RemoveContainer" containerID="9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.597483 4870 scope.go:117] "RemoveContainer" containerID="fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.605536 4870 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77867a54-1bc3-485c-a7b3-0975e8cdfd46-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.605571 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8f5k\" (UniqueName: \"kubernetes.io/projected/77867a54-1bc3-485c-a7b3-0975e8cdfd46-kube-api-access-q8f5k\") on node \"crc\" DevicePath \"\""
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.631262 4870 scope.go:117] "RemoveContainer" containerID="9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d"
Mar 12 00:25:03 crc kubenswrapper[4870]: E0312 00:25:03.631745 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d\": container with ID starting with 9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d not found: ID does not exist" containerID="9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.631814 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d"} err="failed to get container status \"9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d\": rpc error: code = NotFound desc = could not find container \"9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d\": container with ID starting with 9dd241dad307131bd53262fc7840b86c09098cab5cd532b86f7fed30c5d8785d not found: ID does not exist"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.631843 4870 scope.go:117] "RemoveContainer" containerID="fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100"
Mar 12 00:25:03 crc kubenswrapper[4870]: E0312 00:25:03.632240 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100\": container with ID starting with fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100 not found: ID does not exist" containerID="fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100"
Mar 12 00:25:03 crc kubenswrapper[4870]: I0312 00:25:03.632276 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100"} err="failed to get container status \"fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100\": rpc error: code = NotFound desc = could not find container \"fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100\": container with ID starting with fef5c946b601b1fc8f26ab7d013328468ba8b80542c203209d340241c4fb4100 not found: ID does not exist"
Mar 12 00:25:04 crc kubenswrapper[4870]: I0312 00:25:04.122805 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" path="/var/lib/kubelet/pods/77867a54-1bc3-485c-a7b3-0975e8cdfd46/volumes"
Mar 12 00:25:09 crc kubenswrapper[4870]: I0312 00:25:09.201847 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:25:09 crc kubenswrapper[4870]: I0312 00:25:09.257780 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gznsz"]
Mar 12 00:25:09 crc kubenswrapper[4870]: I0312 00:25:09.616211 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gznsz" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerName="registry-server" containerID="cri-o://fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246" gracePeriod=2
Mar 12 00:25:09 crc kubenswrapper[4870]: I0312 00:25:09.955845 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.105635 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-utilities\") pod \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") "
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.106093 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-catalog-content\") pod \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") "
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.106188 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67hfl\" (UniqueName: \"kubernetes.io/projected/bad79eae-c77b-4ab6-ac3f-048e08929cfc-kube-api-access-67hfl\") pod \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\" (UID: \"bad79eae-c77b-4ab6-ac3f-048e08929cfc\") "
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.109046 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-utilities" (OuterVolumeSpecName: "utilities") pod "bad79eae-c77b-4ab6-ac3f-048e08929cfc" (UID: "bad79eae-c77b-4ab6-ac3f-048e08929cfc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.114704 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad79eae-c77b-4ab6-ac3f-048e08929cfc-kube-api-access-67hfl" (OuterVolumeSpecName: "kube-api-access-67hfl") pod "bad79eae-c77b-4ab6-ac3f-048e08929cfc" (UID: "bad79eae-c77b-4ab6-ac3f-048e08929cfc"). InnerVolumeSpecName "kube-api-access-67hfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.191576 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bad79eae-c77b-4ab6-ac3f-048e08929cfc" (UID: "bad79eae-c77b-4ab6-ac3f-048e08929cfc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.207735 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.207769 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad79eae-c77b-4ab6-ac3f-048e08929cfc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.207785 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67hfl\" (UniqueName: \"kubernetes.io/projected/bad79eae-c77b-4ab6-ac3f-048e08929cfc-kube-api-access-67hfl\") on node \"crc\" DevicePath \"\""
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.624538 4870 generic.go:334] "Generic (PLEG): container finished" podID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerID="fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246" exitCode=0
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.624662 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gznsz"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.625400 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gznsz" event={"ID":"bad79eae-c77b-4ab6-ac3f-048e08929cfc","Type":"ContainerDied","Data":"fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246"}
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.625665 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gznsz" event={"ID":"bad79eae-c77b-4ab6-ac3f-048e08929cfc","Type":"ContainerDied","Data":"476e492d10d39fadb297214ea046da46174310964e56d224832150fe087e5090"}
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.625695 4870 scope.go:117] "RemoveContainer" containerID="fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.657603 4870 scope.go:117] "RemoveContainer" containerID="4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.663921 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gznsz"]
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.669407 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gznsz"]
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.690714 4870 scope.go:117] "RemoveContainer" containerID="13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.706221 4870 scope.go:117] "RemoveContainer" containerID="fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246"
Mar 12 00:25:10 crc kubenswrapper[4870]: E0312 00:25:10.706621 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246\": container with ID starting with fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246 not found: ID does not exist" containerID="fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.706644 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246"} err="failed to get container status \"fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246\": rpc error: code = NotFound desc = could not find container \"fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246\": container with ID starting with fa90190173a8ef621a4be62c1e13c7e998b88c43d15456102b8fdf341c491246 not found: ID does not exist"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.706663 4870 scope.go:117] "RemoveContainer" containerID="4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405"
Mar 12 00:25:10 crc kubenswrapper[4870]: E0312 00:25:10.706983 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405\": container with ID starting with 4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405 not found: ID does not exist" containerID="4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.707000 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405"} err="failed to get container status \"4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405\": rpc error: code = NotFound desc = could not find container \"4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405\": container with ID starting with 4a667d49e171280f9bcf0c2344064a1a0d44b43e49a65b891a7c8ac81d911405 not found: ID does not exist"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.707013 4870 scope.go:117] "RemoveContainer" containerID="13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea"
Mar 12 00:25:10 crc kubenswrapper[4870]: E0312 00:25:10.707423 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea\": container with ID starting with 13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea not found: ID does not exist" containerID="13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea"
Mar 12 00:25:10 crc kubenswrapper[4870]: I0312 00:25:10.707448 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea"} err="failed to get container status \"13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea\": rpc error: code = NotFound desc = could not find container \"13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea\": container with ID starting with 13324f4b00d88872235632abaed145daa114fedfebe1cabfdce4f69235cb60ea not found: ID does not exist"
Mar 12 00:25:12 crc kubenswrapper[4870]: I0312 00:25:12.113265 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" path="/var/lib/kubelet/pods/bad79eae-c77b-4ab6-ac3f-048e08929cfc/volumes"
Mar 12 00:25:17 crc kubenswrapper[4870]: I0312 00:25:17.594410 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 00:25:17 crc kubenswrapper[4870]: I0312
00:25:17.595023 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.595039 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.595849 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.595929 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.596706 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31940fcb2bdca3b9e93d8f4a5594da9981665262ae31be473e93140ad64f407d"} pod="openshift-machine-config-operator/machine-config-daemon-84dfr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.596804 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" 
containerName="machine-config-daemon" containerID="cri-o://31940fcb2bdca3b9e93d8f4a5594da9981665262ae31be473e93140ad64f407d" gracePeriod=600 Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.918915 4870 generic.go:334] "Generic (PLEG): container finished" podID="988c0290-1e98-46c8-8253-a4718914b9ef" containerID="31940fcb2bdca3b9e93d8f4a5594da9981665262ae31be473e93140ad64f407d" exitCode=0 Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.918979 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerDied","Data":"31940fcb2bdca3b9e93d8f4a5594da9981665262ae31be473e93140ad64f407d"} Mar 12 00:25:47 crc kubenswrapper[4870]: I0312 00:25:47.919406 4870 scope.go:117] "RemoveContainer" containerID="9d34e3dbb71186ce8356c02e5bee2ab1ff708583b71cba126470e3c14ba16321" Mar 12 00:25:48 crc kubenswrapper[4870]: I0312 00:25:48.931179 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"f3fe06bb11654f676c59626377b3f221fcb0d3122284fab5184552f071051f07"} Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.151281 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554586-fdctd"] Mar 12 00:26:00 crc kubenswrapper[4870]: E0312 00:26:00.152445 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerName="registry-server" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152483 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerName="registry-server" Mar 12 00:26:00 crc kubenswrapper[4870]: E0312 00:26:00.152505 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" 
containerName="extract-content" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152518 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerName="extract-content" Mar 12 00:26:00 crc kubenswrapper[4870]: E0312 00:26:00.152536 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerName="gather" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152550 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerName="gather" Mar 12 00:26:00 crc kubenswrapper[4870]: E0312 00:26:00.152573 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerName="copy" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152585 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerName="copy" Mar 12 00:26:00 crc kubenswrapper[4870]: E0312 00:26:00.152610 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerName="extract-utilities" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152622 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerName="extract-utilities" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152801 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerName="gather" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152820 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="77867a54-1bc3-485c-a7b3-0975e8cdfd46" containerName="copy" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.152843 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad79eae-c77b-4ab6-ac3f-048e08929cfc" containerName="registry-server" Mar 12 00:26:00 crc 
kubenswrapper[4870]: I0312 00:26:00.153553 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554586-fdctd" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.157542 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.157557 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.157924 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554586-fdctd"] Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.164668 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.234863 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl86c\" (UniqueName: \"kubernetes.io/projected/f0032417-3275-4167-821e-1c4fa8172160-kube-api-access-vl86c\") pod \"auto-csr-approver-29554586-fdctd\" (UID: \"f0032417-3275-4167-821e-1c4fa8172160\") " pod="openshift-infra/auto-csr-approver-29554586-fdctd" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.336525 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl86c\" (UniqueName: \"kubernetes.io/projected/f0032417-3275-4167-821e-1c4fa8172160-kube-api-access-vl86c\") pod \"auto-csr-approver-29554586-fdctd\" (UID: \"f0032417-3275-4167-821e-1c4fa8172160\") " pod="openshift-infra/auto-csr-approver-29554586-fdctd" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.372669 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl86c\" (UniqueName: \"kubernetes.io/projected/f0032417-3275-4167-821e-1c4fa8172160-kube-api-access-vl86c\") pod 
\"auto-csr-approver-29554586-fdctd\" (UID: \"f0032417-3275-4167-821e-1c4fa8172160\") " pod="openshift-infra/auto-csr-approver-29554586-fdctd" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.481248 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554586-fdctd" Mar 12 00:26:00 crc kubenswrapper[4870]: I0312 00:26:00.780955 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554586-fdctd"] Mar 12 00:26:01 crc kubenswrapper[4870]: I0312 00:26:01.018814 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554586-fdctd" event={"ID":"f0032417-3275-4167-821e-1c4fa8172160","Type":"ContainerStarted","Data":"32c45f48253c821ce617b3314eb8dfd57949650b1fdb966b4cacacfc7958dfa2"} Mar 12 00:26:03 crc kubenswrapper[4870]: I0312 00:26:03.039933 4870 generic.go:334] "Generic (PLEG): container finished" podID="f0032417-3275-4167-821e-1c4fa8172160" containerID="81fe32370bb1ab83cbb2e57101e4979b00b58969518be285d49e686354593e8f" exitCode=0 Mar 12 00:26:03 crc kubenswrapper[4870]: I0312 00:26:03.040038 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554586-fdctd" event={"ID":"f0032417-3275-4167-821e-1c4fa8172160","Type":"ContainerDied","Data":"81fe32370bb1ab83cbb2e57101e4979b00b58969518be285d49e686354593e8f"} Mar 12 00:26:04 crc kubenswrapper[4870]: I0312 00:26:04.378808 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554586-fdctd" Mar 12 00:26:04 crc kubenswrapper[4870]: I0312 00:26:04.498515 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl86c\" (UniqueName: \"kubernetes.io/projected/f0032417-3275-4167-821e-1c4fa8172160-kube-api-access-vl86c\") pod \"f0032417-3275-4167-821e-1c4fa8172160\" (UID: \"f0032417-3275-4167-821e-1c4fa8172160\") " Mar 12 00:26:04 crc kubenswrapper[4870]: I0312 00:26:04.505332 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0032417-3275-4167-821e-1c4fa8172160-kube-api-access-vl86c" (OuterVolumeSpecName: "kube-api-access-vl86c") pod "f0032417-3275-4167-821e-1c4fa8172160" (UID: "f0032417-3275-4167-821e-1c4fa8172160"). InnerVolumeSpecName "kube-api-access-vl86c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:26:04 crc kubenswrapper[4870]: I0312 00:26:04.600793 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl86c\" (UniqueName: \"kubernetes.io/projected/f0032417-3275-4167-821e-1c4fa8172160-kube-api-access-vl86c\") on node \"crc\" DevicePath \"\"" Mar 12 00:26:05 crc kubenswrapper[4870]: I0312 00:26:05.058073 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554586-fdctd" event={"ID":"f0032417-3275-4167-821e-1c4fa8172160","Type":"ContainerDied","Data":"32c45f48253c821ce617b3314eb8dfd57949650b1fdb966b4cacacfc7958dfa2"} Mar 12 00:26:05 crc kubenswrapper[4870]: I0312 00:26:05.058517 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c45f48253c821ce617b3314eb8dfd57949650b1fdb966b4cacacfc7958dfa2" Mar 12 00:26:05 crc kubenswrapper[4870]: I0312 00:26:05.058618 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554586-fdctd" Mar 12 00:26:05 crc kubenswrapper[4870]: I0312 00:26:05.471601 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554580-j4sw5"] Mar 12 00:26:05 crc kubenswrapper[4870]: I0312 00:26:05.478596 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554580-j4sw5"] Mar 12 00:26:06 crc kubenswrapper[4870]: I0312 00:26:06.117977 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2870ede9-9765-4376-a848-1e2721d3f95c" path="/var/lib/kubelet/pods/2870ede9-9765-4376-a848-1e2721d3f95c/volumes" Mar 12 00:26:32 crc kubenswrapper[4870]: I0312 00:26:32.287665 4870 scope.go:117] "RemoveContainer" containerID="6bc7e9a83d050e78e320633876851f16890ac1132c9a1b4162c3e894a9d4be89" Mar 12 00:27:47 crc kubenswrapper[4870]: I0312 00:27:47.594766 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 00:27:47 crc kubenswrapper[4870]: I0312 00:27:47.595480 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.169765 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554588-gz5cb"] Mar 12 00:28:00 crc kubenswrapper[4870]: E0312 00:28:00.170986 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0032417-3275-4167-821e-1c4fa8172160" containerName="oc" Mar 12 00:28:00 crc 
kubenswrapper[4870]: I0312 00:28:00.171019 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0032417-3275-4167-821e-1c4fa8172160" containerName="oc" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.171396 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0032417-3275-4167-821e-1c4fa8172160" containerName="oc" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.172507 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554588-gz5cb" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.179103 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9fvj8" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.179395 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.179500 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.181064 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554588-gz5cb"] Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.327115 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hvr5\" (UniqueName: \"kubernetes.io/projected/63b4eae6-e67a-45ed-a411-7952a38436f3-kube-api-access-8hvr5\") pod \"auto-csr-approver-29554588-gz5cb\" (UID: \"63b4eae6-e67a-45ed-a411-7952a38436f3\") " pod="openshift-infra/auto-csr-approver-29554588-gz5cb" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.428649 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hvr5\" (UniqueName: \"kubernetes.io/projected/63b4eae6-e67a-45ed-a411-7952a38436f3-kube-api-access-8hvr5\") pod \"auto-csr-approver-29554588-gz5cb\" 
(UID: \"63b4eae6-e67a-45ed-a411-7952a38436f3\") " pod="openshift-infra/auto-csr-approver-29554588-gz5cb" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.480839 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hvr5\" (UniqueName: \"kubernetes.io/projected/63b4eae6-e67a-45ed-a411-7952a38436f3-kube-api-access-8hvr5\") pod \"auto-csr-approver-29554588-gz5cb\" (UID: \"63b4eae6-e67a-45ed-a411-7952a38436f3\") " pod="openshift-infra/auto-csr-approver-29554588-gz5cb" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.505398 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554588-gz5cb" Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.736295 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554588-gz5cb"] Mar 12 00:28:00 crc kubenswrapper[4870]: W0312 00:28:00.745023 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63b4eae6_e67a_45ed_a411_7952a38436f3.slice/crio-0c94a62ef29433972c08a8445e0446c376289bbd216005ccfd1098e4019fa65c WatchSource:0}: Error finding container 0c94a62ef29433972c08a8445e0446c376289bbd216005ccfd1098e4019fa65c: Status 404 returned error can't find the container with id 0c94a62ef29433972c08a8445e0446c376289bbd216005ccfd1098e4019fa65c Mar 12 00:28:00 crc kubenswrapper[4870]: I0312 00:28:00.942598 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554588-gz5cb" event={"ID":"63b4eae6-e67a-45ed-a411-7952a38436f3","Type":"ContainerStarted","Data":"0c94a62ef29433972c08a8445e0446c376289bbd216005ccfd1098e4019fa65c"} Mar 12 00:28:02 crc kubenswrapper[4870]: I0312 00:28:02.960329 4870 generic.go:334] "Generic (PLEG): container finished" podID="63b4eae6-e67a-45ed-a411-7952a38436f3" containerID="260d8d8194707636123ff0895643c42d39392176192c43e762c8ecbfbfb7d915" 
exitCode=0 Mar 12 00:28:02 crc kubenswrapper[4870]: I0312 00:28:02.960394 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554588-gz5cb" event={"ID":"63b4eae6-e67a-45ed-a411-7952a38436f3","Type":"ContainerDied","Data":"260d8d8194707636123ff0895643c42d39392176192c43e762c8ecbfbfb7d915"} Mar 12 00:28:04 crc kubenswrapper[4870]: I0312 00:28:04.299976 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554588-gz5cb" Mar 12 00:28:04 crc kubenswrapper[4870]: I0312 00:28:04.490665 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hvr5\" (UniqueName: \"kubernetes.io/projected/63b4eae6-e67a-45ed-a411-7952a38436f3-kube-api-access-8hvr5\") pod \"63b4eae6-e67a-45ed-a411-7952a38436f3\" (UID: \"63b4eae6-e67a-45ed-a411-7952a38436f3\") " Mar 12 00:28:04 crc kubenswrapper[4870]: I0312 00:28:04.506101 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b4eae6-e67a-45ed-a411-7952a38436f3-kube-api-access-8hvr5" (OuterVolumeSpecName: "kube-api-access-8hvr5") pod "63b4eae6-e67a-45ed-a411-7952a38436f3" (UID: "63b4eae6-e67a-45ed-a411-7952a38436f3"). InnerVolumeSpecName "kube-api-access-8hvr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 00:28:04 crc kubenswrapper[4870]: I0312 00:28:04.593000 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hvr5\" (UniqueName: \"kubernetes.io/projected/63b4eae6-e67a-45ed-a411-7952a38436f3-kube-api-access-8hvr5\") on node \"crc\" DevicePath \"\"" Mar 12 00:28:04 crc kubenswrapper[4870]: I0312 00:28:04.995036 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554588-gz5cb" event={"ID":"63b4eae6-e67a-45ed-a411-7952a38436f3","Type":"ContainerDied","Data":"0c94a62ef29433972c08a8445e0446c376289bbd216005ccfd1098e4019fa65c"} Mar 12 00:28:04 crc kubenswrapper[4870]: I0312 00:28:04.995075 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c94a62ef29433972c08a8445e0446c376289bbd216005ccfd1098e4019fa65c" Mar 12 00:28:04 crc kubenswrapper[4870]: I0312 00:28:04.995108 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554588-gz5cb" Mar 12 00:28:05 crc kubenswrapper[4870]: I0312 00:28:05.381548 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554582-8dlv7"] Mar 12 00:28:05 crc kubenswrapper[4870]: I0312 00:28:05.390888 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554582-8dlv7"] Mar 12 00:28:06 crc kubenswrapper[4870]: I0312 00:28:06.112324 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51df2778-c451-4b96-9c6f-e21248f0945f" path="/var/lib/kubelet/pods/51df2778-c451-4b96-9c6f-e21248f0945f/volumes" Mar 12 00:28:17 crc kubenswrapper[4870]: I0312 00:28:17.594689 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 12 00:28:17 crc kubenswrapper[4870]: I0312 00:28:17.595419 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 00:28:32 crc kubenswrapper[4870]: I0312 00:28:32.364951 4870 scope.go:117] "RemoveContainer" containerID="bc02f3933e8b2081a12f300268c2f15143928208bc939b386caae7fe9c056ce8" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.279134 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w9dzk"] Mar 12 00:28:45 crc kubenswrapper[4870]: E0312 00:28:45.280092 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b4eae6-e67a-45ed-a411-7952a38436f3" containerName="oc" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.280114 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b4eae6-e67a-45ed-a411-7952a38436f3" containerName="oc" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.280354 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b4eae6-e67a-45ed-a411-7952a38436f3" containerName="oc" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.281771 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9dzk" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.311336 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9dzk"] Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.437728 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-utilities\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.438292 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-catalog-content\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.438393 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kth9l\" (UniqueName: \"kubernetes.io/projected/6f07c0f2-b575-4ba0-a32e-9701fce9705c-kube-api-access-kth9l\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.539935 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-catalog-content\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk" Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.540011 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kth9l\" (UniqueName: \"kubernetes.io/projected/6f07c0f2-b575-4ba0-a32e-9701fce9705c-kube-api-access-kth9l\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.540211 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-utilities\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.540608 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-catalog-content\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.540811 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-utilities\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.568012 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kth9l\" (UniqueName: \"kubernetes.io/projected/6f07c0f2-b575-4ba0-a32e-9701fce9705c-kube-api-access-kth9l\") pod \"certified-operators-w9dzk\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") " pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.614076 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:45 crc kubenswrapper[4870]: I0312 00:28:45.959501 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9dzk"]
Mar 12 00:28:46 crc kubenswrapper[4870]: I0312 00:28:46.318709 4870 generic.go:334] "Generic (PLEG): container finished" podID="6f07c0f2-b575-4ba0-a32e-9701fce9705c" containerID="1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741" exitCode=0
Mar 12 00:28:46 crc kubenswrapper[4870]: I0312 00:28:46.318821 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9dzk" event={"ID":"6f07c0f2-b575-4ba0-a32e-9701fce9705c","Type":"ContainerDied","Data":"1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741"}
Mar 12 00:28:46 crc kubenswrapper[4870]: I0312 00:28:46.319050 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9dzk" event={"ID":"6f07c0f2-b575-4ba0-a32e-9701fce9705c","Type":"ContainerStarted","Data":"21002c5f93c465745e8a23716cb73a63525544352685ef6e435337389162496e"}
Mar 12 00:28:47 crc kubenswrapper[4870]: I0312 00:28:47.328462 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9dzk" event={"ID":"6f07c0f2-b575-4ba0-a32e-9701fce9705c","Type":"ContainerStarted","Data":"a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f"}
Mar 12 00:28:47 crc kubenswrapper[4870]: I0312 00:28:47.594609 4870 patch_prober.go:28] interesting pod/machine-config-daemon-84dfr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 00:28:47 crc kubenswrapper[4870]: I0312 00:28:47.594692 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 00:28:47 crc kubenswrapper[4870]: I0312 00:28:47.594753 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-84dfr"
Mar 12 00:28:47 crc kubenswrapper[4870]: I0312 00:28:47.595505 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3fe06bb11654f676c59626377b3f221fcb0d3122284fab5184552f071051f07"} pod="openshift-machine-config-operator/machine-config-daemon-84dfr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 00:28:47 crc kubenswrapper[4870]: I0312 00:28:47.595621 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" podUID="988c0290-1e98-46c8-8253-a4718914b9ef" containerName="machine-config-daemon" containerID="cri-o://f3fe06bb11654f676c59626377b3f221fcb0d3122284fab5184552f071051f07" gracePeriod=600
Mar 12 00:28:48 crc kubenswrapper[4870]: I0312 00:28:48.337752 4870 generic.go:334] "Generic (PLEG): container finished" podID="6f07c0f2-b575-4ba0-a32e-9701fce9705c" containerID="a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f" exitCode=0
Mar 12 00:28:48 crc kubenswrapper[4870]: I0312 00:28:48.337835 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9dzk" event={"ID":"6f07c0f2-b575-4ba0-a32e-9701fce9705c","Type":"ContainerDied","Data":"a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f"}
Mar 12 00:28:48 crc kubenswrapper[4870]: I0312 00:28:48.344341 4870 generic.go:334] "Generic (PLEG): container finished" podID="988c0290-1e98-46c8-8253-a4718914b9ef" containerID="f3fe06bb11654f676c59626377b3f221fcb0d3122284fab5184552f071051f07" exitCode=0
Mar 12 00:28:48 crc kubenswrapper[4870]: I0312 00:28:48.344408 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerDied","Data":"f3fe06bb11654f676c59626377b3f221fcb0d3122284fab5184552f071051f07"}
Mar 12 00:28:48 crc kubenswrapper[4870]: I0312 00:28:48.344452 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-84dfr" event={"ID":"988c0290-1e98-46c8-8253-a4718914b9ef","Type":"ContainerStarted","Data":"b5482185404db1948226ac5b3e2e429b8a7cef6e836ed020640f8ffe120d0704"}
Mar 12 00:28:48 crc kubenswrapper[4870]: I0312 00:28:48.344469 4870 scope.go:117] "RemoveContainer" containerID="31940fcb2bdca3b9e93d8f4a5594da9981665262ae31be473e93140ad64f407d"
Mar 12 00:28:49 crc kubenswrapper[4870]: I0312 00:28:49.360605 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9dzk" event={"ID":"6f07c0f2-b575-4ba0-a32e-9701fce9705c","Type":"ContainerStarted","Data":"d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea"}
Mar 12 00:28:49 crc kubenswrapper[4870]: I0312 00:28:49.392865 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w9dzk" podStartSLOduration=1.963167992 podStartE2EDuration="4.392848386s" podCreationTimestamp="2026-03-12 00:28:45 +0000 UTC" firstStartedPulling="2026-03-12 00:28:46.320209871 +0000 UTC m=+1216.923626181" lastFinishedPulling="2026-03-12 00:28:48.749890225 +0000 UTC m=+1219.353306575" observedRunningTime="2026-03-12 00:28:49.391932399 +0000 UTC m=+1219.995348719" watchObservedRunningTime="2026-03-12 00:28:49.392848386 +0000 UTC m=+1219.996264706"
Mar 12 00:28:55 crc kubenswrapper[4870]: I0312 00:28:55.615126 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:55 crc kubenswrapper[4870]: I0312 00:28:55.616069 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:55 crc kubenswrapper[4870]: I0312 00:28:55.675446 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:56 crc kubenswrapper[4870]: I0312 00:28:56.483257 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:56 crc kubenswrapper[4870]: I0312 00:28:56.545469 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9dzk"]
Mar 12 00:28:58 crc kubenswrapper[4870]: I0312 00:28:58.441920 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w9dzk" podUID="6f07c0f2-b575-4ba0-a32e-9701fce9705c" containerName="registry-server" containerID="cri-o://d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea" gracePeriod=2
Mar 12 00:28:58 crc kubenswrapper[4870]: I0312 00:28:58.983867 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.134418 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kth9l\" (UniqueName: \"kubernetes.io/projected/6f07c0f2-b575-4ba0-a32e-9701fce9705c-kube-api-access-kth9l\") pod \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") "
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.134467 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-utilities\") pod \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") "
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.134491 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-catalog-content\") pod \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\" (UID: \"6f07c0f2-b575-4ba0-a32e-9701fce9705c\") "
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.135717 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-utilities" (OuterVolumeSpecName: "utilities") pod "6f07c0f2-b575-4ba0-a32e-9701fce9705c" (UID: "6f07c0f2-b575-4ba0-a32e-9701fce9705c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.140370 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f07c0f2-b575-4ba0-a32e-9701fce9705c-kube-api-access-kth9l" (OuterVolumeSpecName: "kube-api-access-kth9l") pod "6f07c0f2-b575-4ba0-a32e-9701fce9705c" (UID: "6f07c0f2-b575-4ba0-a32e-9701fce9705c"). InnerVolumeSpecName "kube-api-access-kth9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.189534 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f07c0f2-b575-4ba0-a32e-9701fce9705c" (UID: "6f07c0f2-b575-4ba0-a32e-9701fce9705c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.236546 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kth9l\" (UniqueName: \"kubernetes.io/projected/6f07c0f2-b575-4ba0-a32e-9701fce9705c-kube-api-access-kth9l\") on node \"crc\" DevicePath \"\""
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.236582 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.236597 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f07c0f2-b575-4ba0-a32e-9701fce9705c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.452452 4870 generic.go:334] "Generic (PLEG): container finished" podID="6f07c0f2-b575-4ba0-a32e-9701fce9705c" containerID="d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea" exitCode=0
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.452537 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9dzk" event={"ID":"6f07c0f2-b575-4ba0-a32e-9701fce9705c","Type":"ContainerDied","Data":"d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea"}
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.452577 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9dzk"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.452612 4870 scope.go:117] "RemoveContainer" containerID="d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.452590 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9dzk" event={"ID":"6f07c0f2-b575-4ba0-a32e-9701fce9705c","Type":"ContainerDied","Data":"21002c5f93c465745e8a23716cb73a63525544352685ef6e435337389162496e"}
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.474003 4870 scope.go:117] "RemoveContainer" containerID="a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.499539 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9dzk"]
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.509708 4870 scope.go:117] "RemoveContainer" containerID="1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.510228 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w9dzk"]
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.538007 4870 scope.go:117] "RemoveContainer" containerID="d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea"
Mar 12 00:28:59 crc kubenswrapper[4870]: E0312 00:28:59.538472 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea\": container with ID starting with d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea not found: ID does not exist" containerID="d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.538509 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea"} err="failed to get container status \"d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea\": rpc error: code = NotFound desc = could not find container \"d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea\": container with ID starting with d89f3a5b4b3d827922543cb5e3440a50a095dda026484b7a2f717eb5586530ea not found: ID does not exist"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.538554 4870 scope.go:117] "RemoveContainer" containerID="a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f"
Mar 12 00:28:59 crc kubenswrapper[4870]: E0312 00:28:59.538948 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f\": container with ID starting with a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f not found: ID does not exist" containerID="a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.539036 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f"} err="failed to get container status \"a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f\": rpc error: code = NotFound desc = could not find container \"a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f\": container with ID starting with a018147c4ac0b229311233f0cf3db9109c6ff8373905ddb5408a1ae7bbe4231f not found: ID does not exist"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.539095 4870 scope.go:117] "RemoveContainer" containerID="1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741"
Mar 12 00:28:59 crc kubenswrapper[4870]: E0312 00:28:59.540058 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741\": container with ID starting with 1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741 not found: ID does not exist" containerID="1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741"
Mar 12 00:28:59 crc kubenswrapper[4870]: I0312 00:28:59.540086 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741"} err="failed to get container status \"1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741\": rpc error: code = NotFound desc = could not find container \"1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741\": container with ID starting with 1d199deddfc4e33f76cd92bfc24f711d24c1010fee8aa16153a6782b14528741 not found: ID does not exist"
Mar 12 00:29:00 crc kubenswrapper[4870]: I0312 00:29:00.119329 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f07c0f2-b575-4ba0-a32e-9701fce9705c" path="/var/lib/kubelet/pods/6f07c0f2-b575-4ba0-a32e-9701fce9705c/volumes"